


Research

NPs in German: Locality, theta roles, possessives, and genitive arguments

Authors:

Antonio Machicao y Priemer, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099 Berlin, DE

Stefan Müller, Humboldt-Universität zu Berlin, Unter den Linden 6, 10099 Berlin, DE

Abstract

Since Abney (1987), the DP-analysis has been the standard analysis for nominal complexes, but in the last decade, the NP analysis has experienced a revival. In this spirit, we provide an NP analysis for German nominal complexes in HPSG. Our analysis deals with the fact that relational nouns assign case and theta role to their arguments. We develop an analysis in line with selectional localism (Sag 2012: 149), accounting for the asymmetry between prenominal and postnominal genitives, as well as for the complementarity between higher arguments and possessives, providing a syntactic and semantic analysis.

How to Cite: Machicao y Priemer, A., & Müller, S. (2021). NPs in German: Locality, theta roles, possessives, and genitive arguments. Glossa: A Journal of General Linguistics, 6(1), 46. DOI: http://doi.org/10.5334/gjgl.1128
Submitted on 08 Oct 2019
Accepted on 24 Nov 2020
Published on 12 Apr 2021

1 Introduction

DP or not DP, that is the question. We provide an analysis of nominal complexes (NCs)1 by means of an NP analysis in Head-Driven Phrase Structure Grammar (HPSG). That is, we account for structures with prenominal and postnominal genitive phrases (PreGens and PostGens) such as (1a) and (1b) assuming that the head of the construction is the noun (NP analysis), not the determiner (DP analysis).

    (1) a. Hegels    Beschreibung der     Welt          (DECOW2)
           Hegel.GEN description  the.GEN world.GEN
           ‘Hegel’s description of the world’
        b. die Beschreibung {Hegels    / der     Welt}          (DECOW)
           the description   Hegel.GEN /  the.GEN world.GEN
           ‘the description {by Hegel / of the world}’

The NP analysis takes the noun as the element determining the internal structure as well as the distributional properties of the whole construction. The predominant approach to analysing NCs like (1) in the linguistic literature shifted from an NP analysis (cf. Chomsky 1970; Jackendoff 1977; Szabolcsi 1983) to a DP analysis, at least since Abney’s dissertation (1987), which became the most widespread account of NC structures in the 90s and the beginning of the 2000s (Abney 1986; Hellan 1986; Hudson 1987; Haider 1988; Bhatt 1990; Olsen 1991; Vater 1991; Szabolcsi 1994; Netter 1994; Adger 2003; Alexiadou et al. 2007; Heck & Müller 2006; Salzmann 2020; a.o.). This change towards DP was driven partly by theory-internal assumptions, for instance that all phrases have the same structure qua X-bar theory. Nevertheless, part of the linguistic community – mostly outside MGG3 approaches – advocated the NP analysis (Baltin 1989; Pollard & Sag 1994: 359–376; Van Langendonck 1994; Sadler & Arnold 1994; Demske 2001: Section 4; Payne & Huddleston 2002; Van Eynde 2006; 2020b; Müller 2013: 88–88; 2020; Machicao y Priemer 2017; a.o.), and in the last decade, even in MGG approaches, the nP/NP analysis has experienced a revival (Bruening 2009; 2020; Chomsky 2007; Georgi & Müller 2010; Chomsky et al. 2019; a.o.).

This article is organised as follows: Section 2 gives an outline of the development of the DP analysis showing how empirical evidence regarding the parallelism between sentences and NCs combined with theory-internal assumptions leads to the DP analysis. In Section 3, we discuss some DP approaches in German. We focus the discussion on (i) the element proposed as D head, (ii) the realisation of nominal arguments, and (iii) the structures proposed. As a kind of introduction to our analysis, we first give a short overview of HPSG (Section 4). In Section 5, we show how an NP approach can account for (i) the realisation of “external” and “internal”4 arguments of the head noun, (ii) theta role and case assignment, (iii) the asymmetry between external and internal arguments of a noun, (iv) the role of possessives within NCs, and finally (v) the semantic composition of N head and arguments. Furthermore, our analysis discusses (and allows for) complex and recursive structures in prenominal position which have been neglected in the literature on German PreGens. In Section 6, we compare our analysis with other possible NP approaches in HPSG.

2 From NP to DP: A summary

Within Transformational Grammar, the parallel between nouns and verbs had already been pointed out by Chomsky (1965: 184–192). The underlying question is – taking the relation between head and arguments for granted – how to get from one form (e.g. destroy in (2a)) to another (e.g. destruction in (2b)), and how the relation between these words is to be accounted for.

    (2) a. They destroy the property.
        b. their destruction of the property

At the beginning of the X-bar theory (cf. Chomsky 1970), the main focus is on the comparison between the structure of sentences and NCs. The idea underlying a parallel analysis of both is that a structural position correlates with function. Interestingly, the parallel made at this stage is between determiner and auxiliary verb (cf. (3) and (4)), with both functional elements occupying the specifier position of lexical phrases, NP or VP respectively (cf. Chomsky 1970: 210).5

    (3) John proved the theorem.
    (4) John’s proof of the theorem

In the further development of X-bar theory (cf. Jackendoff 1977: 36–37), the concepts of headedness and endocentricity are strengthened. In comparison to the earlier stage, it is assumed that a sentence (S) is a projection of the lexical head V (Vʹʹʹ in (5)) and not an exocentric structure (cf. (3)). This change leads to a welcome parallel since the subject-of relation can be modelled as a head-specifier configuration in NCs and sentences alike (Jackendoff 1977: 38–41; 165–166). That is, if John has the same interpretation in (5) and (6), then it has to be in a similar structural relation to the head of the dominating phrase in both cases.6 Hence, this analysis strengthens the idea of structural positions reflecting syntactic functions.

    (5) John has proven the theorem.
    (6) John’s proof of the theorem

The further development of this approach strengthens the constraints imposed on structures, for instance w.r.t. binary branching and maximal projections (cf. Chomsky 1986: 2–4; Chomsky & Lasnik 1993: 527–533). Furthermore – and this is of significant relevance for the DP analysis – the functional categories COMP and INFL are introduced (cf. Chomsky 1981: 18–19; 1986: 4–6). INFL is a functional category replacing the auxiliary complex (for tense, agreement, and modality) of earlier stages of the theory (cf. (5)) and marking the relation between a subject and its predicate. COMP, on the other hand, is justified by the necessity of a position for complementizers (e.g. that) preceding the subject, as well as for wh-elements normally preceding a “subject-position” (Chomsky 1986: 5). COMP and INFL project the same complete phrasal structure as lexical elements such as verbs or nouns (cf. (7) and (8)), following the principles stated by X-bar theory (Chomsky & Lasnik 1993: Section 3.2), in particular: (i) all phrases (of all categories) have the same structure, (ii) a head is not optional, and (iii) every phrase has only one element identified as its head. A welcome effect of these principles is that sentences can be analysed as endocentric constructions, and there is no need to analyse them as “defective categories” (Chomsky 1986: 5), e.g. as exocentric structures or C′ structures without a phrasal level. Following these assumptions, taking functional elements (I in (7), C in (8), cf. Chomsky 1986: 73) as the heads of sentential structures, and taking into account the observation that NCs and sentences show parallel structural behaviour, logically leads to the analysis of the functional category D as the head of NCs. This analysis is worked out in Abney (1987), but the basic idea can be found in the literature since the 1970s, in some cases following the line of thought just presented (cf. Lyons 1977: 392; Vennemann & Harlow 1977; Brame 1982; Hellan 1986; Abney 1986; a.o.).7 The DP analysis was embraced by the linguistic community – not only in MGG approaches8 – and has been the standard analysis for NCs since then.

    (7) Mary saw Penny.
    (8) that Mary saw Penny

As this development shows, the theoretical assumptions are closely linked to framework-internal assumptions, such as the axioms of X-bar theory, their assumed status as a universal part of our language competence, and the strong correlation of position and function. In the case at hand, the parallel between sentences and NCs, as well as between determiner–NC and subject–sentence relations, has been recognised in more surface-oriented frameworks such as HPSG too (cf. Sag et al. 2003: 64). But one main difference between the two types of frameworks lies in how pieces of linguistic information are modelled. While MGG approaches model single morphosyntactic or semantic functions (e.g. number, gender, tense, agreement, etc.) as different syntactic heads (cf. Alexiadou et al. 2007: 227–235), inducing phrasal structures, surface-oriented frameworks deal with these functions as parts of observable linguistic objects such as words, affixes, etc. (cf. Section 4). In some cases, this competition for a “better approach”, i.e. one accounting for more data with a more economic system, leads to a rethinking of the model used. For instance, the necessity of an IP analysis for German clauses has been challenged by many researchers, also in the MGG tradition (Bayer & Kornfilt 1989; Haider 1993; a.o.). Thus, without an IP in German, the structural IP/DP parallel cannot be maintained, neither for German nor as part of UG.9 In Minimalism, since Hauser, Chomsky & Fitch (2002), the concept of UG has changed from a rich UG (comprising different modules such as X-bar theory, the Case Principle, the Theta Criterion, etc.) to a maximally simple UG reduced to context-free recursion (cf. Richards 2015: 804–805).10 This theoretical change, dissociating the theory from X-bar theory, some assumptions about functional projections, etc., leads to a gradual rethinking towards an nP/NP analysis (cf. Section 3.2; Chomsky 2007: 25–26; Bruening 2009; 2020; Georgi & Müller 2010: 1–3; Chomsky et al. 2019: 22; a.o.). In the next section, the outline of the theoretical development just presented will focus on DP analyses in German, and especially on the realisation of PreGens.

3 Discussing DP analyses in German

In German, the prenominal position can be occupied by an element in the genitive – similar to English. PreGens can be syntactically simple or complex (9) and they are in complementary distribution with determiners (10). This complementarity suggests that determiners and PreGens compete for the same position, or at least have “something in common” syntactically (cf. Adger 2003: 257–258 for English). In the following sections, we discuss the realisation of the D head, case and theta role marking, and locality aspects of DP approaches for German NCs.

    (9) {Jacobs    / Des     Arztes}    Behandlung war erfolgreich.
         Jacob.GEN /  the.GEN doctor.GEN treatment  was successful
        ‘{Jacob’s / The doctor’s} treatment was successful.’
    (10) a. *Jacobs    die   Behandlung
             Jacob.GEN the.F treatment.F
         b. *die   Jacobs    Behandlung
             the.F Jacob.GEN treatment.F
            Intended: ‘Jacob’s treatment’ or ‘the treatment of Jacob’

3.1 DP proposals with PreGens

One way to analyse German PreGens is to give a parallel analysis to Abney’s (1987)11 for English (cf. (11)). This has been proposed in Olsen (1991: 47–51). In her analysis, the morpheme -s occupies the D position, creating a possessive relation between the specifier Jacob and the complement Behandlung ‘treatment’.12 Olsen’s (1991: 48) analysis treats -s not as morphological realisation of genitive case, but as a possessive head (cf. also Haider 1988: 52–55), in this case a determiner assigning case and theta role to the DP in specifier position (cf. (12)).

    (11) [tree diagram]
    (12) [tree diagram]

Hartmann & Zimmermann (2003) propose a similar approach. Given that most PreGens are proper nouns (or proper-noun-like), i.e. not complex, they propose – in contrast to Olsen – that the complete PreGen, e.g. Jacobs in (12), is the D head. In the less frequent cases where complex PreGens appear (cf. (13)), these are reanalysed as heads.13 In their analysis, the prenominal genitive marker -s is not (morpho-)syntactic case, but only semantic, in contrast to PostGens, which – according to them – bear morpho-syntactic genitive marking (1b). This raises the following question w.r.t. complex PreGens: Before the phrase-to-head reanalysis, what licenses the combination of a determiner, which is obviously in the genitive, with a caseless noun (cf. (13) for the representation in Hartmann & Zimmermann 2003: 180)? For this analysis, a very specific and idiosyncratic construction is needed.

    (13) [tree diagram]

Both analyses are similar, postulating that the genitive marker -s in PreGens is (i) not a case marker and (ii) attached to the (complex) phrase preceding it. These analyses are in that sense parallel to Abney’s.14 But German and English display some differences w.r.t. their PreGens. Genitive marking in German is morphological case marking, but a phrasal marker in English (cf. Haider 1988: 36; Pollard & Sag 1994: 53; Adger 2003: 256–259; Roeper & Snyder 2005: 164). The -s marker is attached to the whole phrase in English (14a), but not in German, where all elements of the phrase are genitive-marked (cf. (14b) vs. (15)).

    (14) a.  [the queen of England]’s hat          (Haider 1988: 36)
         b. *[die     Königin   von England]s    Hut
              the.NOM queen.NOM of  England-GEN hat
            Intended: ‘the queen of England’s hat’

Furthermore, as argued in Vater (1991: 22–24), the elements in the PreGen do not necessarily have to be marked with a genitive -s; rather, they follow their regular inflectional paradigm. For instance, Kaiser ‘emperor’ belongs to the strong inflectional class, i.e. it has an -s morpheme in the genitive (cf. (15c) from Haider 1988: 37), but Biograph ‘biographer’ belongs to the weak inflectional class, i.e. it marks the genitive with -en (cf. (15a) from Vater 1991: 23), and Königin ‘queen’ does not bear overt genitive marking (cf. (15b) from Lühr 1991: 42).15

    (15) a. [des     Biograph-en]   Hinweis
            the.GEN biographer-GEN hint
            ‘the biographer’s hint’
         b. [der     Königin   von England] unermäßlicher Reichtum
            the.GEN queen.GEN of  England   immense       wealth
            ‘the queen of England’s immense wealth’
         c. [des     Kaiser-s]   neue Kleider
            the.GEN emperor-GEN new  clothes
            ‘the emperor’s new clothes’

Thus, assuming (12) or (13) as the structure for German NCs would be problematic since the PreGens in (15) are, in fact, in the genitive and not in a possessive case or a semantic genitive. Analysing a genitive morpheme as the D head implies that the whole DP is in the genitive (cf. Van Eynde 2006: 141; 2020b: 12; Machicao y Priemer 2017: 234; 2018b). But as (16) shows, Jacobs Nachbar ‘Jacob’s neighbour’ cannot be the object of a verb selecting a genitive NC, but the subject of tanzen ‘to dance’, i.e. it is a phrase in the nominative.16

    (16) a. *Wir gedenken [Jacobs    Nachbar].
             we  remember  Jacob.GEN neighbour
            Intended: ‘We remember Jacob’s neighbour.’
         b. [Jacobs    Nachbar]      tanzt  sehr gern.
            Jacob.GEN neighbour.NOM dances very willingly
            ‘Jacob’s neighbour likes to dance.’

Reviewing these data w.r.t. distribution and case-marking, it can be said that German and English are similar in allowing a complementary distribution between determiners and NCs in prenominal position. But they are different w.r.t. the nature of their genitive markers, i.e. phrasal vs. morphological.

Sternefeld (2015: 209–213, 2009: 587–589) takes a different approach in proposing an analysis with an empty element in the D position. In possessive constructions, the PreGen is base generated as the specifier of DP (17), and argumental PreGens would have to be moved to the specifier of DP (18).17

    (17) [tree diagram]
    (18) [tree diagram]

The empty D marks the DP in specifier position (i.e. the PreGen) with genitive. This account has two advantages compared to Olsen’s and Hartmann & Zimmermann’s. First, every single element in the specifier position of the DP can bear morphological genitive (cf. ‘the.GEN biographer.GEN’ in (15a)) since the empty D assigns genitive to the whole phrase in its specifier. Second, complex PreGens are possible since the prenominal position for PreGens is a phrasal position. In contrast, Hartmann & Zimmermann (2003: 180) assume the reanalysis of two single heads into one (e.g. Ddes + NBlauwal in (13)), which is again combined with a further head, the -s marker, building a new D head, i.e. the determiner of the noun.

Complex syntactic objects in prenominal position in German are a controversial topic (for discussion, see Section 5.5). In any case, examples from the literature as well as corpus data (19) show that complex PreGens are possible, challenging approaches that analyse PreGens as syntactically simple objects.

    (19) a. [eines jeden     Mannes]  Zier          (Sternefeld 2015: 212)
            a.GEN  every.GEN man.GEN adornment
            ‘every man’s adornment’
         b. auf [des     guten    Doktors]    Gesicht          (DECOW)
            on   the.GEN good.GEN doctor.GEN face
            ‘on the good doctor’s face’
         c. [eines großen  Meisters]  Lebenswerk          (DECOW)
            a.GEN  big.GEN master.GEN life’s.work
            ‘a great master’s oeuvre’

Sternefeld’s account overcomes some difficulties of Olsen’s and Hartmann & Zimmermann’s analyses w.r.t. complex PreGens and case assignment, but it encounters other problems of a more general nature holding for all DP analyses, closely related to the locality issue explained in the next section.

3.2 Locality

Locality is a restriction on dependency relations in structures of natural languages (e.g. subcategorisation). Local relations are restricted to a specific domain; the boundary of this locality domain is normally the maximal projection XP of the X-head creating the dependency relation (cf. Muysken 1982: 64). Taking (20) as an example, X0 is in a local relation with its arguments YP and ZP, and can have access to their properties since they are all in the same locality domain, XP. But X0 cannot have direct access to the properties of YP’s or ZP’s constituents, e.g. α or β, nor can Z0 access the properties of YP, since they are in different local domains (cf. Pollard & Sag 1987: 73, 143–145; 1994: 23; Sag 2007: 403; a.o.).

    (20) [tree diagram]

With respect to NCs, the question of locality arises regarding (i) their internal structure, i.e. the relation between N0 and its arguments, and (ii) the selection of NCs from outside, i.e. when the NC is selected by another element. Starting with the second point, assuming that subcategorisation is local in the sense just mentioned, Bruening (2009: 28–29) – building on Baltin (1989: 3–4) – points out that “verbs that select nominal arguments never select for particular determiners”. That is, as (21) suggests, if a verb allows the combination with an NC, it does not select properties of the determiner (e.g. indefinite, definite, or possessive). Verbs select properties of the noun, e.g. number or other s-selectional properties (cf. Chomsky et al. 2019: 22).18 For instance, a verb such as versammeln ‘to gather’ requires a complement denoting a plurality, either morphosyntactically realised (22a) or not (22c), but it does not impose constraints specifically on the determiner.19

    (21) Jacob ate {a / the / my} steak.
    (22) a.  er […] versammelte [seine  Mönche]  um     sich          (DECOW)
             he     gathered     his.PL monks.PL around himself
            ‘He gathered his monks around himself.’
         b. *Er […] versammelte [seinen Mönch]  um     sich.
             he     gathered     his.SG monk.SG around himself
         c.  […] er [die    Familie]  versammelt          (DECOW)
                 he  the.SG family.SG gathered
            ‘He gathered the family.’

In this respect, NCs and sentences are different, and this is particularly important for analyses based on a CP/DP parallel (cf. Szabolcsi 1994: 198). Verbs selecting clausal complements select a specific complementiser, i.e. properties of C, as the following examples show (slightly modified from Baltin 1989: 3), while verbs selecting NCs do not select for properties of D, but of N (cf. (22)).

    (23) a.  John declared [CP that Sally was insane].          (Baltin 1989: 3)
         b. *John declared [CP for Sally to be insane].
         c. *John was waiting [CP that Sally left].
         d.  John was waiting [CP for Sally to leave].
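
To make the notion of selectional locality concrete, here is a minimal sketch (hypothetical Python of our own devising, not drawn from any actual HPSG implementation; all names are illustrative): a selecting head can constrain the features a phrase projects, but the phrase's internal daughters – e.g. its determiner – lie outside the local domain and simply cannot be mentioned in a valence description.

```python
# Sketch of selectional locality: a phrase exposes only its projected
# top-level features ("synsem") to a selecting head; internal daughters
# are invisible from outside. All names here are illustrative.

def satisfies(description, synsem):
    """True iff every attribute-value constraint holds of `synsem`."""
    return all(synsem.get(attr) == val for attr, val in description.items())

# "seine Moenche" 'his monks': NUM is projected from the head noun;
# the determiner is an internal daughter, not part of the interface.
np = {
    "synsem": {"cat": "noun", "num": "pl"},
    "daughters": [                      # NOT accessible to a selector
        {"synsem": {"cat": "det", "poss": True}},
        {"synsem": {"cat": "noun", "num": "pl"}},
    ],
}

# versammeln 'to gather' can constrain projected noun features
# (simplifying: its plurality requirement is semantic, cf. (22c)) ...
versammeln_comps = {"cat": "noun", "num": "pl"}
print(satisfies(versammeln_comps, np["synsem"]))  # True

# ... but a verb selecting "a possessive determiner" cannot be stated:
# the determiner's features sit inside `daughters`, outside the domain.
print("poss" in np["synsem"])  # False
```

This mirrors Bruening's observation in the form of a data-structure choice: whatever is not projected to the phrase's interface is not selectable.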

Returning to the first point, it is generally assumed that theta role and case are assigned in a local relation between head and arguments (cf. Sag 2012: 149; Chomsky et al. 2019: 3; a.o.). In German, event nominalisations, e.g. Behandlung ‘treatment’ in (24), inherit the argument structure of their base verbs, e.g. behandeln ‘treat’ (cf. Section 5.1). That is, the arguments of a verb can be realised in the phrase headed by its nominalisation as well. The nominal arguments which are realised with structural case (cf. Section 5.2) can be prenominal or postnominal (e.g. Jacobs and des Patienten in (24)). If only one argument is realised, it can be interpreted as agent or patient of the N head (24b)–(24e). If both are realised, there is a fixed phrasal order: the external argument (agent in (24a)) is realised as a PreGen, the internal one (patient in (24a)) as a PostGen. Furthermore, the interpretation of PreGens and PostGens depends on the head noun, e.g. Jacob is interpreted as experiencer in (26).

    (24) a. [Jacobs]AG Behandlung [des     Patienten]PAT
            Jacob.GEN  treatment   the.GEN patient.GEN
         b. [Jacobs]AG Behandlung
            Jacob.GEN  treatment
         c. [des     Patienten]PAT Behandlung
            the.GEN patient.GEN   treatment
         d. die Behandlung [Jacobs]AG
            the treatment   Jacob.GEN
         e. die Behandlung [des     Patienten]PAT
            the treatment   the.GEN patient.GEN
         f. die Behandlung
            the treatment
    (25) *[des     Patienten]PAT Behandlung [Jacobs]AG
           the.GEN patient.GEN  treatment   Jacob.GEN
         Intended: ‘Jacob’s treatment of the patient’
    (26) [Jacobs]EXP Begeisterung
         Jacob.GEN   exaltation
         ‘Jacob’s exaltation’

Comparing (17) and (20), it is clear that the PreGen (corresponding to YP) is not in a local relation with the noun (corresponding to Z0).20 Since theta roles should be determined by N, analyses such as in (12) or (17) need a solution for having the external argument in a local relation with D, and not with N. Thus, these analyses have three possibilities to deal with the external argument’s theta role: (i) it is assigned by the determiner, (ii) it is assigned by the noun locally and then this argument is moved (cf. (18)), or (iii) it is assigned non-locally.

In (i), the external argument is assumed to be base generated as the specifier of DP, with case and theta role assigned by the D head. Since D assigns the theta role, it is assumed to be a quite general “possessive” role (cf. Olsen 1991: 48; Hartmann & Zimmermann 2003: 181; a.o.).21 But this analysis neglects the different interpretations of PreGens depending on the N head, e.g. (24b) vs. (26), as well as the ungrammaticality of (25). In a similar manner, Kratzer (1996: 126–128) assumes that the external argument is not an argument of the N head at all and “[t]he genitive […] expresses a general notion of relatedness of which the agent relation is but a special case”.22 The proposal in (ii) (cf. (18)) would assume that the external argument is base generated inside NP (or nP, the extended projection of NP) as its specifier and moved to the specifier of DP, e.g. to get assigned case (cf. Radford 2000: 8–9; Alexiadou et al. 2007: 560–563; Sternefeld 2009: 587–589). This proposal would reflect the IP/DP parallel since the same movement (due to case assignment) is assumed for subjects from vP/VP to IP,23 but since IPs are not assumed in German (cf. Section 2), the parallel becomes void. The last proposal (iii) is ruled out for theoretical and empirical reasons since locality is a prominent characteristic of natural language (cf. Rizzi 1990: ix; Sag 2007: 395–397; a.o.). In a non-local approach, the head noun would have to determine properties, e.g. the theta role, of the specifier of its selector D0 (cf. (20)). Different accounts have been proposed to deal with locality problems of this kind. Most of them redefine or extend the locality domain of lexical heads (following Grimshaw 1991; see also the LFG concept of cohead in Bresnan et al. 2016: 51, 105). Their locality domain is assumed to be not only their own phrasal level, but also the phrasal level of the functional projections belonging to the lexical head (e.g. nP, NumP, and DP for N). The lexical and functional head(s) share categorial features, e.g. nominal for N and D, projecting this feature to the maximal level. But as pointed out in Riemsdijk (1998: 1–7), we still have different maximal projections, and therefore also different locality domains. He proposes different types of heads: lexical, semi-lexical, and functional. These elements combine, each specifying different aspects of the structure (e.g. lexical vs. functional), leading to only one maximal projection. In this type of approach, the notion of head is understood as “head w.r.t. the features it specifies”, reminiscent of the notion of relativized head in Di Sciullo & Williams (1988: 25–27) for morphological purposes. Hence, in Riemsdijk (1998: 36), an NC is not only the projection of a lexical head, but also of functional heads. The cost of this approach is thus several (relativized) heads inside one phrase.

This difficulty w.r.t. the locality domain is one of the reasons for some MGG approaches to reconsider the nP/NP analysis, since “thematic relations are established in a strictly local fashion” (Chomsky et al. 2019: 3). For instance, in Chomsky (2007: 25–26), Bruening (2009: 33), and Chomsky et al. (2019: 22), it is assumed that the head of the NC is not D but n, the extended projection of N.24 It is important to take into account that n in these accounts is still a functional head taking the NP as complement, and therefore not completely compatible with the locality definition proposed above. In the remainder of this paper, we will first explain the basic concepts of the framework used (Section 4) in order to achieve a local NP analysis (Section 5) that accounts for the constituent order regularities with the corresponding case and theta role assignments shown in (24).

4 Basic concepts of HPSG

Our analysis of NCs is conducted in HPSG (cf. Pollard & Sag 1987; 1994). In contrast to the MGG analyses just discussed, HPSG is a model-theoretic approach (cf. Pullum & Scholz 2001; Richter 2007), belonging to the so-called constraint-based frameworks. In this spirit, linguistic theories are sets of constraints – formalised as attribute-value pairs – stating the necessary conditions on the structure of individual expressions (cf. Pullum & Scholz 2001: 19; Müller & Machicao y Priemer 2019: Section 4). HPSG models descriptions of linguistic objects (e.g. words, phrases, rules, etc.) by means of feature descriptions, which are formalised as attribute-value matrices (AVMs). AVMs consist of pairs of attributes (in small caps) and corresponding values of a certain type (in italics). For instance, in (27) the attribute NUMBER (NUM) has the value singular (sg). Values can be atomic or complex: the value of NUMBER in (27) is atomic (sg), while the value of INDEX (e.g. for the word man) in (28) is of type referential index (abbreviated as ref)25 and is complex, i.e. the value itself is a feature description.

    (27) [NUM sg]
    (28) [AVM: complex INDEX value of type ref for the word man]
Every value, atomic or complex, and thus every feature structure, is of a certain type. Types are hierarchically ordered in the grammar from the most general types at the top to the most specific ones at the bottom. For instance, (29) shows the type hierarchy for number with its two subtypes singular and plural. While describing linguistic objects, we can use underspecified types to express generalisations. For example, it is possible to give a description of the linguistic object the as having the most unspecific value (i.e. number) for the attribute NUM, since it can be used in singular as well as in plural contexts. But in a particular utterance, the values must be maximally specific, e.g. sg in the child and pl in the children. The NUM value will be specified by means of an agreement constraint which states that the NUM values of a determiner and its corresponding noun must agree.

    (29) [type hierarchy for number, with subtypes singular and plural]
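
The way an underspecified type is resolved to a maximally specific one can be sketched as follows (a simplified Python illustration; only the type names number, sg, and pl come from (29), everything else is a hypothetical rendering of unification):

```python
# Simplified type hierarchy for number, following (29): the general type
# `number` subsumes the maximally specific types `sg` and `pl`.
SUBSUMES = {
    "number": {"number", "sg", "pl"},
    "sg": {"sg"},
    "pl": {"pl"},
}

def unify(t1, t2):
    """Return the more specific of two compatible types, else None."""
    if t2 in SUBSUMES[t1]:
        return t2
    if t1 in SUBSUMES[t2]:
        return t1
    return None  # incompatible types: unification fails

# `the` is underspecified (type `number`); nouns are maximally specific.
print(unify("number", "sg"))  # 'sg'  -> "the child"
print(unify("number", "pl"))  # 'pl'  -> "the children"
print(unify("sg", "pl"))      # None  -> agreement clash
```

The agreement constraint mentioned in the text corresponds here to requiring that unification of the determiner's and the noun's NUM values succeed.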

HPSG is a lexicalist framework, i.e. most of the relevant linguistic information is located in the lexicon, simplifying in that way the phrase structural component. Linguistic objects of all kinds (e.g. words, phrases, rules, etc.) are treated as signs in the spirit of Saussure (1916), i.e. form and meaning are always represented conjoined26 and always accessible. This is a major difference between HPSG and MGG. In the latter, syntax, semantics, and phonology are represented in different modules accessible only at different points of the derivation (cf. Chomsky 1995: 20–23; Richards 2015: 812; 830). In HPSG, the descriptions of linguistic objects are systematically stored in the (abstract) lexicon using the same formal mechanisms of description (i.e. feature descriptions) for all of them (cf. Müller 2019: 205–210). The organisation of the lexicon depicts the generalisations of all kinds of linguistic objects due to a type hierarchy reflecting different levels of abstraction (cf. Müller & Machicao y Priemer 2019: 330–331).

HPSG is surface-oriented and declarative, i.e. non-derivational. That is, structures are not derived from underlying structures, e.g. a passive from an active clause. Structures are analysed directly, involving their observable elements in the positions they occupy. In general, concepts such as “movement” and “empty element” are avoided.27 The descriptions of linguistic objects given by HPSG focus on what can be observed, without presupposing a universal syntactic structure connecting structural positions to grammatical functions. That is, the workflow in HPSG consists of (i) giving an adequate description of a phenomenon, (ii) formalising its generalisation, and (iii) – if possible – matching the generalisations across different languages, arriving at a universal core of constraints (cf. Sag & Wasow 2011: 372; Müller 2014; 2015a). Hence, HPSG analyses do not resort to syntactic structures based on evidence from languages other than the one at issue.

As already mentioned, HPSG is a lexicalist framework, where lexicalist can be understood as in the following quote:

In lexical (or lexicalist) approaches, words are phonological forms paired with valence structures (also called predicate argument structures). A word’s predicate argument structure contains descriptions of the argument phrases the word combines with, and specifies the meaning of the combination as a function of the meanings of the parts. Lexical rules grammatically encode the systematic relations between cognate forms and diathesis alternations. Syntactic rules combine the words into larger units: sentences, NPs, APs, and so on. The syntactic combinatorial rules for endocentric structures are usually assumed to be very general and few in number.

  —(Müller & Wechsler 2014: 2).

Following this kind of approach, we can give a description of Behandlung ‘treatment’ as in (30).

    1. (30)

The AVM in (30) is the description of a stem (cf. type stem). It contains phonological information (value of PHON(OLOGY))28 as well as syntactic and semantic information (value of SYN(TAX)-SEM(ANTICS)). The value of SYNSEM is complex: it comprises different attributes (e.g. LOC(AL) and NON-LOC(AL) for local and non-local information).29 LOC is subdivided into syntactic information (in CAT(EGORY)) and semantic information (in CONT(ENT)). In CAT, there is an attribute called HEAD, whose value provides the information of the lexical head that is “projected”30 to the phrasal level by means of the Head Feature Principle (cf. Pollard & Sag 1994: 34). This information, e.g. part of speech information, is particularly important for the distribution of the phrase.31 ARG(UMENT)-ST(RUCTURE) is a further syntactic feature. Its value is a list of synsem elements, the arguments of the linguistic object. This list does not itself represent the elements that the described expression syntactically requires; rather, it links the semantic valency of the predicate (in CONT) with the syntactic valency of the expression (SPR and COMPS – explained in Section 5.1). The elements of the ARG-ST list are ordered following the Accessibility Hierarchy (Keenan & Comrie 1977). That is, their order reflects their prominence w.r.t. independent linguistic phenomena such as passive, relative clauses, case, binding, extraction, etc. (cf. Manning & Sag 1998: 111; Koenig 1999: 29; a.o.). The semantic information is represented as the value of CONT. The value of IND(EX) is a complex value bearing the properties of the referential variable of the expression (cf. (28)). The value of REL(ATION)S is a list of elementary predications describing the meaning of the expression (cf. Copestake et al. 2005: 283). In (30), the list has only one element: the AVM for a treat-relation (treat-rel).
This (Davidsonian) relation is a complex value with three attribute-value pairs: ARG0 takes the referential variable of the stem as its value, and the semantic arguments of the predicate, AG(ENT) and PAT(IENT), are given with their respective values as well. The indexed boxes in (30) are values (atomic or complex) shared with other parts of the structure, i.e. they are the notation for token identity. This representation of identity between (sub-)structures is called structure sharing and constitutes one of the most important descriptive devices in HPSG and other declarative frameworks (cf. Bildhauer 2014: 528–529; Müller 2019: 211–213; a.o.). In (30), the ARG0 value of treat-rel and the IND value of the stem are structure shared. Furthermore, by means of structure sharing, (30) states that the first element of the ARG-ST list32 of Behandlung is interpreted as the agent and the second as the patient.
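The notion of structure sharing just described can also be illustrated outside the AVM notation. The following Python sketch is our illustration, not part of the paper's formalism: AVMs are modelled as nested dictionaries, and structure sharing as token identity of Python objects (the attribute names and index features are simplified).

```python
# Sketch only: AVMs as nested dicts; structure sharing = object identity.

# The referential index of the stem: a single object that is shared.
index = {"PER": 3, "NUM": "sg", "GEN": "fem"}

agent = {"INDEX": {}}    # synsem of the first ARG-ST element (simplified)
patient = {"INDEX": {}}  # synsem of the second ARG-ST element (simplified)

behandlung = {
    "PHON": ["behandlung"],
    "SYNSEM": {
        "LOC": {
            "CAT": {
                "HEAD": "noun",
                "ARG-ST": [agent, patient],
            },
            "CONT": {
                "IND": index,
                "RELS": [{
                    "RELN": "treat-rel",
                    "ARG0": index,            # structure shared with IND
                    "AGENT": agent["INDEX"],  # linked to ARG-ST element 1
                    "PATIENT": patient["INDEX"],
                }],
            },
        }
    },
}

rel = behandlung["SYNSEM"]["LOC"]["CONT"]["RELS"][0]
# Token identity (structure sharing), not mere equality of values:
assert rel["ARG0"] is behandlung["SYNSEM"]["LOC"]["CONT"]["IND"]
assert rel["AGENT"] is behandlung["SYNSEM"]["LOC"]["CAT"]["ARG-ST"][0]["INDEX"]
```

The point of using `is` rather than `==` is precisely the difference between token identity and accidental equality of values, which is what the boxed tags in an AVM express.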

5 NP analysis in HPSG

In this section, we provide an analysis of NCs as NPs. In Section 5.1, we give a formalisation of the transfer of argument structure from verbal to nominal stems by means of lexical rules (LRs), reflecting the fact shown in (24) that event nominalisations are able to realise the structural-case arguments of their verbal counterparts. Since we are dealing with a lexicalist approach, the manipulation of argument structure must be addressed on the lexical level. In Section 5.2, we show how to account for the nominative-accusative to genitive case alternation from the verbal to the nominal domain. In addition, we outline a solution for the alternation between genitive NPs and von-PPs. In Section 5.3, we focus on the prenominal position. We discuss a parallel construction in German, the prenominal dative with possessive, that supports our analysis with an empty determiner for PreGens. Furthermore, we show that possessive elements (determiners, NPs in the genitive, von-PPs) interact with the argument structure of the noun, being in complementary distribution with the external argument. This leads us to an analysis of possessives as arguments of the noun (cf. Barker 2012: 1114), accounting for the alternation by means of an LR. In Section 5.4, we show how the syntactic composition of the lexical elements explained in the previous sections leads us to an NP structure. In comparison to MGG approaches, functional properties of the structures are handled lexically, and not by means of phrasal structures. Therefore, we can account for the different types of constructions just presented while respecting locality – as defined in Section 3.2 – and for the different interpretations of PreGens and PostGens without stipulating a quite general theta role for external arguments (cf. Kratzer 1996: 126–128). Our analysis is based on selectional localism (Sag 2012), i.e.

[f]or purposes of category selection (subcategorization), case assignment, (non-anaphoric) agreement, and semantic role assignment, a lexical head has access only to the signs it selects via some feature (e.g. ARG-ST [… ]), i.e. the elements that it is connected to via a grammatical relation [… ]          (Sag 2012: 149)

Therefore, the N head determines the form and interpretation of the elements in its ARG-ST list, and the NC can be selected from outside according to the properties of the N head, not of D. Finally, in Section 5.5, we provide corpus evidence that constructions with PreGens can be complex and recursive. Furthermore, we provide a semantic analysis accounting for the compositionality of these structures.

The data we account for have been discussed in (24). What has to be ruled out is a structure with a prenominal internal and a postnominal external argument (31a), as well as both arguments realised in prenominal (31b–c) or postnominal position (31d–e). Furthermore, for common nouns (e.g. Behandlung ‘treatment’), realisation without a determiner must be ruled out (31f–h).

    1. (31)
    1. a.
    1. *[des
    2.     the.GEN
    1. Patienten]PAT
    2. patient.GEN
    1. Behandlung
    2. treatment
    1. [Jacobs]AG
    2.   Jacob.GEN
    1.  
    1. b.
    1. *[Jacobs]AG
    2.     Jacob.GEN
    1. [des
    2.   the.GEN
    1. Patienten]PAT
    2. patient.GEN
    1. Behandlung
    2. treatment
    1.  
    1. c.
    1. *[des
    2.     the.GEN
    1. Patienten]PAT
    2. patient.GEN
    1. [Jacobs]AG
    2.   Jacob.GEN
    1. Behandlung
    2. treatment
    1.  
    1. d.
    1. *die
    2.   the
    1. Behandlung
    2. treatment
    1. [Jacobs]AG
    2.   Jacob.GEN
    1. [des
    2.   the.GEN
    1. Patienten]PAT
    2. patient.GEN
    1.  
    1. e.
    1. *die
    2.   the
    1. Behandlung
    2. treatment
    1. [des
    2.   the.GEN
    1. Patienten]PAT
    2. patient.GEN
    1. [Jacobs]AG
    2.   Jacob.GEN
    1.  
    1. f.
    1. *Behandlung
    2.   treatment
    1.  
    1. g.
    1. *Behandlung
    2.   treatment
    1. [Jacobs]AG
    2.   Jacob.GEN
    1.  
    1. h.
    1. *Behandlung
    2.   treatment
    1. [des
    2.   the.GEN
    1. Patienten]PAT
    2. patient.GEN

5.1 Nominalisation and argument structure

As shown in (24), in event nominalisations the arguments (with structural case) of the verbal stem are preserved. The LR in (32) licenses the event nominalisation with the German affix -ung. It takes a linguistic object of type stem as input, e.g. behand(e)l- ‘treat’, and licenses a linguistic object of type ung-n-stem (a subtype of nominal stem, cf. (34)) as output, e.g. Behandlung ‘treatment’. LRs specify only those aspects of input and output that are changed; everything else remains the same. For instance, (32) specifies: the type of the linguistic object (stem vs. ung-n-stem), its phonological form (that of the stem in the input, extended by the phonological form of the affix -ung in the output), its HEAD value (verb vs. noun), and the ARG-ST list.33

    1. (32)
    1. LR: -ung nominalisation

The value of the ARG-ST list of the verbal input is divided into three lists. The append operator (i.e. ⊕) is used to combine lists into one single list. The actual value of ARG-ST is therefore the concatenation of three lists in the input (and of two lists in the output). The lists are ordered according to the types of elements they contain (cf. Przepiórkowski 1999: 18–19). The first list contains NPs with structural case, the second one NPs with lexical case, and the third one PP arguments.34 The type list has two subtypes: e-list (i.e. empty-list) and ne-list (i.e. non-empty-list). That is, the LR (32) can handle verbs with one or more arguments of different types (structural, lexical, and PPs). For instance, an intransitive verbal stem (i.e. with one NC with structural case) such as genes- ‘convalesce’ (35) can be nominalised by this rule. In this case, list(lex) and list(pp) are empty. A verbal stem such as verzeih- ‘forgive’ has two arguments with structural and one with lexical case (cf. (33a) from Schumacher et al. 2004: 813). According to (32), only the list(str) and the (empty) list(pp) are carried over to the nominal stem, but not the list(lex), as (33b)–(33d) show: the person who forgives (33b), as well as the thing to be forgiven (33c), can be realised, but not the person who is forgiven (33d), neither with lexical nor with structural case.
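The effect of the -ung LR on the ARG-ST list can be sketched procedurally. The following Python fragment is our illustration under simplifying assumptions (the type labels `np-str`, `np-lex`, `pp` and the dictionary representation are ours, not the paper's AVMs):

```python
# Hedged sketch of the -ung nominalisation LR (32): the verbal ARG-ST is the
# concatenation (the append operator) of a list of structural-case NPs, a
# list of lexical-case NPs, and a list of PPs; only the first and third
# lists are carried over to the nominal stem.

def ung_nominalisation(stem):
    """Input: a verbal stem; output: an ung-n-stem (illustrative only)."""
    assert stem["HEAD"] == "verb"
    strs = [a for a in stem["ARG-ST"] if a["TYPE"] == "np-str"]
    lexs = [a for a in stem["ARG-ST"] if a["TYPE"] == "np-lex"]
    pps = [a for a in stem["ARG-ST"] if a["TYPE"] == "pp"]
    assert stem["ARG-ST"] == strs + lexs + pps  # lists ordered by type
    return {
        "PHON": stem["PHON"] + ["ung"],  # stem phonology plus the affix
        "HEAD": "noun",
        "ARG-ST": strs + pps,            # list(lex) is not carried over
    }

# verzeih- 'forgive': two structural arguments and one lexical (dative)
# argument, cf. (33a)
verzeih = {"PHON": ["verzeih"], "HEAD": "verb",
           "ARG-ST": [{"TYPE": "np-str"}, {"TYPE": "np-str"},
                      {"TYPE": "np-lex"}]}
verzeihung = ung_nominalisation(verzeih)
# The dative argument is lost, cf. (33d):
assert verzeihung["ARG-ST"] == [{"TYPE": "np-str"}, {"TYPE": "np-str"}]
```

An intransitive stem such as genes- ‘convalesce’ would pass through the same function with empty `lexs` and `pps` lists, matching the text's observation that (32) handles verbs of any valency.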

    1. (33)
    1. a.
    1.   Kannst
    2.   can
    1. [du]
    2.   you.NOM
    1. [mir]
    2. me.DAT
    1. [meinen
    2.   my.ACC
    1. Wutausbruch]
    2. tantrum.ACC
    1. verzeihen?
    2. forgive
    1.   ‘Can you forgive (me) my outburst of anger?’
    1.  
    1. b.
    1.   die
    2.   the
    1. Verzeihung
    2. forgiveness
    1. des
    2. the.GEN
    1. Königspaares
    2. royal.couple.GEN
    1. (DECOW)
    2.  
    1.   ‘the royal couple’s forgiveness’
    1.  
    1. c.
    1.   die
    2.   the
    1. Verzeihung
    2. forgiveness
    1. der
    2. the.GEN
    1. Sünden
    2. sins.GEN
    1. (DECOW)
    2.  
    1.   ‘the forgiveness of the sins’
    1.  
    1. d.
    1. *die
    2.   the
    1. Verzeihung
    2. forgiveness
    1. {dem
    2.   the.DAT
    1. Täter
    2. offender.DAT
    1. /
    2.  
    1. des
    2. the.GEN
    1. Täters}
    2. offender.GEN
    1.   Intended: ‘the forgiveness of the offender’

As already mentioned, we are accounting for changes in the argument structure of lexical elements on the lexical level. One fact that can be observed in (24) is that arguments of N heads are optional35 (cf. Bierwisch 1989: 7; Ehrich & Rapp 2000: 275; a.o.). Since this seems to be a property of all nouns/nominalisations, but not (necessarily) of the underlying verb stems, the optionality of arguments can be modelled as a more general constraint on nouns.36

In HPSG, syntactic and semantic valency are represented separately. The semantic valency is represented as attribute-value pairs of a semantic relation, e.g. in (30) the treat-rel(ation) has an agent and a patient of an event. The syntactic valency is represented as the value of two further syntactic attributes: SP(ECIFIE)R and COMP(LEMENT)S (cf. (34)). By means of syntactic combination (cf. Section 5.4), the elements of the SPR and COMPS lists are saturated. The link between syntactic and semantic arguments takes place via the ARG-ST list. Through constraints applying to stems, the elements of the ARG-ST list, which are mapped to the semantic arguments (see (30)), are mapped onto SPR and COMPS according to specific characteristics of the language (e.g. SOV vs. SVO language) or of a specific construction (cf. Manning & Sag 1998: 124–125; Davis & Koenig 2000: 67; Van Eynde 2015: 114–117; Machicao y Priemer & Fritz-Huechante 2018: 167).

(34)37 shows a part of the type hierarchy for stems. Type hierarchies in HPSG are representations of generalisations in the lexicon. They are conceived as inheritance hierarchies, i.e. constraints applying to supertypes apply to their subtypes as well; e.g. the HEAD value of nominal-stem (n-stem), i.e. noun, applies also to the subtypes event-n-stem (ev-n-stem), ung-n-stem, etc. Due to multiple inheritance, types (e.g. ung-n-stem-2) can inherit constraints from different supertypes (e.g. ung-n-stem and n-as-mapping-2).

    1. (34)

The hierarchy in (34) is divided into two subtypes: part-of-speech-stems (pos-stem) and argument-structure-mapping (as-mapping). The former constrains the stems of different parts of speech, e.g. verbal stems and nominal stems (n-stem). The latter provides constraints for the mapping between the value of ARG-ST and the syntactic valency attributes (SPR and COMPS). The type ung-n-stem has two subtypes, which inherit constraints from three different kinds of argument-structure mappings.38 The LR (32), together with the as-mapping constraints, licenses the adequate patterns given in (24) and rules out the ones in (31). By means of structure sharing, the constraints determine which argument of the ARG-ST list is going to be realised as an element of COMPS or of SPR. It is important to stress that the terms specifier and complement in HPSG are similar but not synonymous with their MGG counterparts. In HPSG, syntactic positions do not have a function per se. For instance, that a phrase is realised as the specifier of a head in HPSG does not imply that it is more “subject-like”. SPR and COMPS state something about the hierarchical position of a phrase relative to other phrases combined with the head (cf. Section 5.4). To which extent a phrase is to be interpreted as more or less “subject-like” is reflected lexically, by its position in the ARG-ST list, since case assignment, linking with semantic arguments, binding, etc. refer to it.
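The inheritance logic of the stem hierarchy can be mimicked with class inheritance. The following toy Python sketch is our illustration only: the class names mirror the type names in (34), and the "constraints" are simplified to class attributes, which is far cruder than HPSG's typed feature constraints.

```python
# Toy illustration of (multiple) inheritance in the stem hierarchy (34).

class Stem:
    pass

class NStem(Stem):
    HEAD = "noun"          # constraint on all nominal stems

class UngNStem(NStem):
    pass

class NAsMapping2(Stem):
    SPR_LEN = 1            # one argument with structural case as specifier
    COMPS_LEN = 1          # one argument with structural case as complement

class UngNStem2(UngNStem, NAsMapping2):
    pass                   # inherits from two different supertypes

# Constraints on supertypes automatically apply to subtypes:
assert UngNStem2.HEAD == "noun"    # inherited via ung-n-stem < n-stem
assert UngNStem2.COMPS_LEN == 1    # inherited via n-as-mapping-2
```

The multiple-inheritance step corresponds to the statement that ung-n-stem-2 inherits constraints both from ung-n-stem and from n-as-mapping-2.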

In the as-mapping constraints in (34), the ARG-ST lists are divided into three lists in ung-n-stem-1 and into two lists in ung-n-stem-2. Depending on the specific value of list in ung-n-stem-1, the NP with structural case referred to can be the first element of the ARG-ST list (if the first list is specified as e-list) or any other NP with structural case (if the first list is specified as ne-list). The constraint n-as-mapping-1 is represented as a disjunction of two possible mappings. The constraint at the top of the disjunction licenses PostGens, while the other one licenses PreGens. This reflects the fact that nouns allowing a PreGen allow a PostGen as well. To be more explicit, nominal stems of type ung-n-stem-1 (inheriting the constraint at the top of the disjunction) license structures with a determiner as specifier and one optional argument with structural case as complement. According to that, either the first element with structural case, i.e. the external argument, as in (24d), or the second one, i.e. the internal argument, as in (24e), can be licensed postnominally. Since the realisation of the element in the COMPS list is optional, the combination of determiner and N head as in (24f) is licensed as well. Nominal stems inheriting the constraint at the bottom of the disjunction license structures with one argument with structural case as specifier and no argument as complement (cf. the empty COMPS list). This constraint licenses structures having either the first (24b) or the second element with structural case (24c) in the prenominal position. On the other hand, nominal stems of type ung-n-stem-2 (inheriting n-as-mapping-2) license structures with one argument with structural case as specifier and one argument with structural case as complement. This constraint specifies that the first element of the ARG-ST list has to be realised as specifier, i.e. prenominally, while the second one has to be realised as a complement, i.e. postnominally (24a).

Summarising, the LR (32), interacting with the argument-structure mapping constraints in (34), licenses the structures provided in (24) and, consequently, rules out the constructions in (31). Since the as-mapping constraints allow at most one argument in the specifier position of the phrase, (31b) and (31c) are ruled out. Similarly, these constraints allow at most one NP with structural case in COMPS. Thus, (31d) and (31e) are ruled out as well. Furthermore, (31a) is not allowed, since ung-n-stem-2 licenses only structures with the first element of the ARG-ST list in prenominal and the second in postnominal position, not the other way around. Finally, (31f), (31g), and (31h) are ruled out since all three constraints in (34) require the realisation of a determiner (DetP)39 as the specifier of the noun.
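The licensing effect of the three mapping options can be made concrete with a small enumeration. The Python sketch below is our simplification, under the assumptions that the ARG-ST list contains only structural-case NPs (abbreviated by role labels) and that the empty determiner inside the PreGen's DetP is collapsed into the specifier element:

```python
# Illustrative enumeration of the as-mapping options described for (34).

def mappings(arg_st):
    """Return the (SPR, COMPS) pairs licensed for a nominal stem whose
    arg_st contains only structural-case NPs (a simplification)."""
    out = []
    # n-as-mapping-1, first disjunct: DetP specifier, at most one
    # (optional) structural argument as complement -> PostGens.
    out.append((["DetP"], []))                       # cf. (24f)
    for a in arg_st:
        out.append((["DetP"], [a]))                  # cf. (24d)/(24e)
    # n-as-mapping-1, second disjunct: one structural argument as
    # specifier, empty COMPS -> PreGens.
    for a in arg_st:
        out.append(([a], []))                        # cf. (24b)/(24c)
    # n-as-mapping-2: first argument as specifier, second as complement.
    if len(arg_st) >= 2:
        out.append(([arg_st[0]], [arg_st[1]]))       # cf. (24a)
    return out

licensed = mappings(["agent", "patient"])
assert (["agent"], ["patient"]) in licensed          # (24a)
assert (["patient"], ["agent"]) not in licensed      # rules out (31a)
assert (["DetP"], ["agent", "patient"]) not in licensed  # rules out (31d-e)
assert ([], []) not in licensed                      # rules out (31f-h)
```

The final assertions restate the summary above: no mapping realises both arguments on one side of the noun, inverts their prenominal/postnominal order, or leaves the specifier unfilled.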

It is important to point out that the LR (32) as well as the as-mapping constraints in (34) license nominal stems not only from transitive verbal stems such as behand(e)l- ‘treat’, but also from intransitive ones such as genes- ‘convalesce’ (35), and ditransitives such as verzeih- ‘forgive’ (33). Only stems of type ung-n-stem-2 must have two arguments with structural case, since this is explicitly stated in the constraint n-as-mapping-2; all other stems can have two such arguments, but they do not have to.

    1. (35)
    1. a.
    1. des
    2. the.GEN
    1. Schimmels
    2. white.horse.GEN
    1. Genesung
    2. convalescence
    1. (DECOW)
    2.  
    1. ‘the convalescence of the white horse’
    1.  
    1. b.
    1. eine
    2. a
    1. baldige
    2. soon
    1. Genesung
    2. convalescence
    1. Elisabeths
    2. Elisabeth.GEN
    1. (DECOW)
    2.  
    1. ‘a speedy recovery of Elisabeth’
    1.  
    1. c.
    1. Ausschlaggebend
    2. decisive
    1. für
    2. for
    1. die
    2. the
    1. Genesung
    2. convalescence
    1. ist […]
    2. is
    1. (DECOW)
    2.  
    1. ‘It is decisive for the recovery […]’

5.2 Case assignment

In MGG, case assignment is related to specific positions in syntactic structure. For instance, while the external argument of a verb is base generated as the specifier of VP (or vP), it moves to the specifier position of IP (or TP or AgrSP) in order to get case assigned (cf. Adger 2003: 211–217; Alexiadou et al. 2007: 279–281). In contrast, the accusative of internal arguments is assigned inside the VP (or AgrOP, cf. Chomsky 1995: 149–150). Assuming the parallel between sentences and NCs, a similar case assignment reasoning must be applied. But the parallels between sentences and NCs have their limits. The external argument of the noun is base generated in the specifier of NP and moved to the specifier of DP in order to get case (cf. (18)). The internal argument of the noun receives case in its base position. Assuming that case and theta roles are assigned in specific syntactic positions is problematic for data such as (36), which shows the external argument in postnominal position.40

    1. (36)
    1. die
    2. the
    1. Diagnosen
    2. diagnoses
    1. und
    2. and
    1. Behandlungen
    2. treatments
    1. ihrer
    2. their.GEN
    1. Tierärzte
    2. vets.GEN
    1. (DECOW)
    2.  
    1. ‘their vets’ diagnoses and treatments’

For the PostGen ihrer Tierärzte to get the agent role, it needs to be base generated in the specifier of NP. Moving this phrase to the complement position would assign case to it, but also a further theta role (patient), since this is the position for the internal argument. Moving this phrase to the specifier of DP would assign case to the phrase, but would lead to the wrong linearisation. Therefore, to analyse this kind of construction (cf. also (35b) vs. (35a), and (24d) vs. (24e)), further functional projections would be needed as landing positions for the N head and the determiner, enabling them to precede the external argument (cf. Sternefeld 2009: 587–589).

In HPSG, case assignment is not related to specific positions in a syntactic structure (i.e. SPR and COMPS), but is done lexically. Case is specified in the lexical items of linguistic objects (cf. (39), (40)) and handled by a general principle (38). Thus, movement and further functional phrases (e.g. nP, DP, AgrP, etc.) for case assignment are not used. The verb stem behandel- ‘to treat’, from which the nominal stem Behandlung ‘treatment’ is derived, has two arguments with (unspecified) structural case in its ARG-ST list. It is commonly known that case is sensitive to syntactic contexts, e.g. in an active sentence (37a) the external argument of a verb is realised in the nominative and its internal argument with accusative, while in a passive sentence (37b) the internal argument is realised in the nominative.

    1. (37)
    1. a.
    1. Jacob
    2. Jacob.NOM
    1. behandelt
    2. treats
    1. den
    2. the.ACC
    1. Patienten.
    2. patient.ACC
    1. ‘Jacob treats the patient.’
    1.  
    1. b.
    1. Der
    2. the.NOM
    1. Patient
    2. patient.NOM
    1. wird
    2. is
    1. behandelt.
    2. treated
    1. ‘The patient is treated.’

By means of a passivisation LR, a new lexical item without a subject in its ARG-ST list is licensed,41 hence only one argument with structural case remains, yielding the case assignment in (37b) according to the Case Principle in (38) (cf. Müller 2003, 2019: 285–288). Through the nominalisation of verb stems, the list of structural arguments (list(str) in (32)) remains unaffected. The difference between the verbal and the nominal stem in German is that structural case is realised as nominative or accusative in the verbal, but as genitive in the nominal domain (cf. Chomsky 1981: 170; Haider 1985: 80–81; Machicao y Priemer 2017: 124–136). In HPSG, the Case Principle (cf. Meurers 1999: 204; Przepiórkowski 1999: 79–80) takes care of the appropriate case assignment in German and many other languages lexically, making use of the information in the ARG-ST list of the lexical item, yielding a local solution.

(38) Case Principle (simplified)42
 
  • In the verbal domain (cf. (39)), the first element with structural case in the ARG-ST list receives nominative, all further elements in the list with structural case receive accusative.
  • In the nominal domain (cf. (40)), elements with structural case in the ARG-ST list receive genitive.

    1. (39)
    1. (40)
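The lexical character of the Case Principle in (38) can be sketched as a function over ARG-ST, with no reference to tree positions. The Python fragment below is our simplification (case values are plain strings; lexical cases are simply passed through):

```python
# Minimal sketch of the (simplified) Case Principle (38): case is resolved
# by inspecting the ARG-ST list, not via syntactic positions.

def assign_case(head_domain, arg_st):
    """arg_st: list of case values, 'str' for structural case or a lexical
    case such as 'dat'. Returns the resolved case values."""
    resolved = []
    seen_structural = False
    for case in arg_st:
        if case != "str":
            resolved.append(case)      # lexical case stays as it is
        elif head_domain == "nominal":
            resolved.append("gen")     # all structural args: genitive
        elif not seen_structural:
            resolved.append("nom")     # first structural arg: nominative
            seen_structural = True
        else:
            resolved.append("acc")     # further structural args: accusative
    return resolved

# behandel- 'treat': two structural arguments, cf. (37a)
assert assign_case("verbal", ["str", "str"]) == ["nom", "acc"]
# passive: the subject is removed from ARG-ST, cf. (37b)
assert assign_case("verbal", ["str"]) == ["nom"]
# Behandlung 'treatment': structural case realised as genitive, cf. (40)
assert assign_case("nominal", ["str", "str"]) == ["gen", "gen"]
# verzeih- 'forgive': the dative keeps its lexical case, cf. (33a)
assert assign_case("verbal", ["str", "str", "dat"]) == ["nom", "acc", "dat"]
```

Note how the passive example falls out for free: once the passivisation LR has removed the subject from ARG-ST, the remaining structural argument is automatically the first one and receives nominative.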

Furthermore, in German, NC arguments of N heads can be marked either with the genitive or with the preposition von.43 That the preposition von can be used with external (41a) or internal (41b) arguments indicates that von is a semantically vacuous preposition and that it is not von but the N head itself that assigns the theta roles. When marked with the preposition, arguments are realised postnominally, as is usual for PPs inside NCs. Since this is not the topic of this paper, we do not give a full analysis of the genitive-vs.-von variation. But an analysis of this phenomenon can be given straightforwardly with the means just shown, i.e. with an LR. This LR would take one element of the list of structural arguments (i.e. list(str)) of the nominal stem (cf. the output of LR (32)) and put it into its list of PP arguments (i.e. list(pp)). This LR can be applied recursively (41c) until there are no more structural arguments left.
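Since the von-alternation LR is only sketched in the text, the following Python fragment shows one way its recursive application could work. It is our formulation under stated assumptions: we let the rule take the first element of list(str) (the text leaves the choice open), and arguments are abbreviated by simple labels.

```python
# Sketch of the von-PP alternation LR (our formulation): it moves one
# element from list(str) to list(pp) and may apply recursively.

def von_alternation(noun):
    """Return all stems derivable by repeatedly marking one structural
    argument with the semantically vacuous preposition von."""
    results = [noun]
    if noun["str"]:
        first, *rest = noun["str"]
        derived = {"str": rest, "pp": noun["pp"] + [("von", first)]}
        results += von_alternation(derived)
    return results

# Übertragung 'transmission' with two structural arguments, cf. (41c)
stems = von_alternation({"str": ["HIV", "Mutter"], "pp": []})
# Recursion stops when no structural arguments are left:
assert stems[-1] == {"str": [], "pp": [("von", "HIV"), ("von", "Mutter")]}
assert len(stems) == 3
```

The three outputs correspond to the stem with zero, one, or two von-marked arguments, mirroring the claim that the LR applies recursively until list(str) is empty.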

    1. (41)
    1. a.
    1. unter
    2. under
    1. der
    2. the
    1. Führung
    2. guidance
    1. [von
    2.   of
    1. Crazy
    2. Crazy
    1. Boy
    2. Boy
    1. Henderson]AG
    2. Henderson
    1. (DECOW)
    2.  
    1.  
    1. b.
    1. die
    2. the
    1. Verwaltung
    2. administration
    1. und
    2. and
    1. Führung
    2. management
    1. [von
    2.   of
    1. Fondsdepots]PAT
    2. fund.deposits
    1. (DECOW)
    2.  
    1.  
    1. c.
    1. Übertragung
    2. transmission
    1. [von
    2.   of
    1. HIV]
    2. HIV
    1. [von
    2.   from
    1. der
    2. the
    1. Mutter]
    2. mother
    1. auf
    2. to
    1. das
    2. the
    1. Kind
    2. child
    1. (DECOW)
    2.  

5.3 The specifier of NP

As can be concluded from the LR (32) and the as-mapping constraints (34), we propose that PreGens are (to some extent) inside the DetP, in specifier position (cf. (42)).44 There are different ways to account for the patterns in (24) in an NP analysis. One possibility is to have a unary syntactic rule converting PreGens into determiners. Another possibility is to propose an empty determiner taking the PreGen as its argument. We choose the latter analysis, since it is supported by language internal data (cf. Section 5.3.1) and requires fewer additional assumptions needed only for this construction, such as an idiosyncratic rule converting a complex phrase into a determiner (but cf. Footnote 70).

In the remainder of this section, we will first introduce a parallel structure to the PreGens (cf. Section 5.3.1), namely prenominal datives (PreDat), which gives us evidence for the structure proposed. Then we will turn to possessive constructions (cf. Section 5.3.2) and show that they can be analysed by the same means presented here, allowing them to be interpreted as arguments of the head noun. Finally (Section 5.3.3), we provide the lexical items for the different types of determiners which are combined with the N head and make it possible to analyse PreGens, PreDats and possessives with a unified account.

    1. (42)

5.3.1 A parallel structure: The prenominal dative

It is commonly assumed that (at least) singular common nouns need a determiner in order to constitute a complete NC (cf. (31f–h)). The as-mapping constraints (34) take this fact into account by adding a determiner to the syntactic valency of the noun (cf. DetP in SPR), and are therefore in line with general assumptions about the structure of NCs. The question is: can determiners host an NC, and if so, is there any language internal evidence for that? In some varieties of German (e.g. Alemannic and Swabian), there is a construction with an NC in the dative preceding a possessive determiner and its N head (43a) (cf. Demske 2001: Section 4.3.4; Zifonun 2003: 102; Karnowski & Pafel 2004: 181–184; Sternefeld 2015: 220–221). As (43b) shows, the NC in the dative (dem Fischer ‘the fisher’) cannot follow the N head (Frau ‘wife’); it can only precede the possessive determiner (seine ‘his’).

    1. (43)
    1. a.
    1.   Das
    2.   this
    1. ist
    2. is
    1. [dem
    2.   the.DAT
    1. Fischer
    2. fisher.DAT
    1. seine
    2. his.NOM
    1. Frau].
    2. wife.NOM
    1. (Sternefeld 2015: 221)
    2.  
    1.   ‘This is the fisher’s wife.’
    1.  
    1. b.
    1. *Das
    2.   this
    1. ist
    2. is
    1. [seine
    2.   his.NOM
    1. Frau
    2. wife.NOM
    1. dem
    2. the.DAT
    1. Fischer].
    2. fisher.DAT
    1.   Intended: ‘This is the fisher’s wife.’

The possessive determiner (seine) in (43a) agrees with the N head (Frau) in case, number, and gender. This construction can appear in the preverbal position in declarative sentences (44), and it cannot be divided (45); hence this complex structure behaves as one constituent (cf. Karnowski & Pafel 2004: 181; Machicao y Priemer 2018a).45 Furthermore, the whole construction can be the complement of a preposition, but only the possessive determiner and the head noun receive case from it. In (46), the preposition auf ‘to’ assigns accusative to seinen Tipp ‘his hint’.

    1. (44)
    1. [Klaus
    2.   Klaus.DAT
    1. sein
    2. his.NOM
    1. Händler]
    2. dealer.NOM
    1. hat
    2. has
    1. auch
    2. too
    1. noch
    2. still
    1. ein
    2. a
    1. paar.
    2. pair
    1. (DECOW)
    2.  
    1. ‘Klaus’ dealer also has some.’
    1. (45)
    1. a.
    1. *Klaus
    2.   Klaus.DAT
    1. hat
    2. has
    1. sein
    2. his.NOM
    1. Händler
    2. dealer.NOM
    1. auch
    2. too
    1. noch
    2. still
    1. ein
    2. a
    1. paar.
    2. pair
    1.  
    1. b.
    1. *Sein
    2.   his.NOM
    1. Händler
    2. dealer.NOM
    1. hat
    2. has
    1. Klaus
    2. Klaus.DAT
    1. auch
    2. too
    1. noch
    2. still
    1. ein
    2. a
    1. paar.
    2. pair
    1. (46)
    1. Haben
    2. have
    1. dann
    2. then
    1. [PP auf
    2.   to
    1. [Fabi
    2.   Fabi.DAT
    1. seinen
    2. his.ACC
    1. Tipp]]
    2. hint.ACC
    1. gehört [… ]
    2. heard
    1. (DECOW)
    2.  
    1. ‘We then listened to Fabi’s hint.’

PreGens and PreDats show: (i) German DetPs can be complex, i.e. with an NC in predeterminer position; and (ii) determiners can constrain elements in their specifier. That is, (i) the possessive determiner in (43a) requires an (optional) dative NC46 as its specifier, while (ii) the empty determiner for PreGens (42) requires an obligatory genitive NC.

5.3.2 Possessives

When a head noun is combined with a possessive element (a determiner (47a), NP (47b), or PP (47c)), there is a relationship between the discourse referent introduced by the possessive and that of the head noun. We will show that the N head has to be interpreted as relational (cf. type shifting in Barker 2012: 1114), although in some cases this relation can be rather underspecified (cf. Szabolcsi 1994: 193).47

    1. (47)
    1. a.
    1. sein
    2. his
    1. Haus
    2. house
    1.  
    1. b.
    1. das
    2. the
    1. Haus
    2. house
    1. Jacobs
    2. Jacob.GEN
    1.  
    1. c.
    1. das
    2. the
    1. Haus
    2. house
    1. von
    2. of
    1. Jacob
    2. Jacob

Interestingly, possessives behave like arguments. They cannot be iterated (48a), just as other thematic arguments cannot (48b). Moreover, the variation between genitive and von-PP (cf. Section 5.2) in German also applies to possessives (47b)–(47c). That is, the sketched LR for this alternation can apply to possessives as well if they are included in the ARG-ST list – a further indication of their argumental status. Furthermore, when the N head is inherently relational (e.g. an ung-nominalisation), a possessive determiner can be interpreted as its argument (49a).

    1. (48)
    1. a.
    1. *seinPOSS
    2.   his
    1. Haus
    2. house
    1. [von
    2.   of
    1. dem
    2. the
    1. Vater]POSS
    2. father
    1.  
    1. b.
    1. *MariosEXP
    2.   Mario.GEN
    1. Genesung
    2. convalescence
    1. [von
    2.   of
    1. Peter]EXP
    2. Peter
    1. (49)
    1. a.
    1. seinePOSS/AG/PAT
    2. his
    1. Behandlung
    2. treatment
    1.  
    1. b.
    1. seinePOSS/AG/*PAT
    2. his
    1. Behandlung
    2. treatment
    1. Jacobs*POSS/*AG/PAT
    2. Jacob.GEN

Also w.r.t. constituent order, possessives show the same restrictions as arguments, with one exception: the possessor and the highest argument of a nominal head cannot co-occur. For instance, given the context in (50), i.e. interpreting Tim as patient, Peter as possessor (of the day care), and Bernd as agent (of the day care), it is possible to have the agent or the possessor preceding the patient (50a). But the patient can precede neither the agent nor the possessor (cf. (50b) and (49b)). What is more, possessor and agent cannot be realised simultaneously (50c)–(50d).

    1. (50)
    1. Context: Peter is the father of Tim, and Bernd is the babysitter of the child.
    1.  
    1. a.
    1.   {BerndsAG
    2.     Bernd.GEN
    1. /
    2.  
    1. PetersPOSS}
    2. Peter.GEN
    1. Betreuung
    2. day.care
    1. des
    2. the.GEN
    1. KindesPAT
    2. child.GEN
    1.  
    1. b.
    1. *TimsPAT
    2.   Tim.GEN
    1. Betreuung
    2. day.care
    1. {des
    2. the.GEN
    1. BabysittersAG
    2. babysitter.GEN
    1. /
    2.  
    1. des
    2. the.GEN
    1. VatersPOSS}
    2. father.GEN
    1.  
    1. c.
    1. *PetersPOSS
    2.   Peter.GEN
    1. Betreuung
    2. day.care
    1. des
    2. the.GEN
    1. BabysittersAG
    2. babysitter.GEN
    1.  
    1. d.
    1. *BerndsAG
    2.   Bernd.GEN
    1. Betreuung
    2. day.care
    1. des
    2. the.GEN
    1. VatersPOSS
    2. father.GEN

These examples show that (i) the possessor behaves as the highest argument of the N head,48 and (ii) since they cannot co-occur, the highest argument must be (syntactically) suppressed when the possessor is realised. We analyse this change in the argument structure by means of the LR in (51). The input of (51) is a nominal stem whose first structural argument on the ARG-ST list is a semantic argument of the relation denoted by the noun. In the output of the rule, the relation denoted by the noun is still available, but the first structural argument of the ARG-ST list is now different: it is the possessor of the denotation of the noun. Note that the agent is not deleted from the semantics (i.e. it is still semantically implied; cf. Footnote 41), but it is no longer connected to the ARG-ST list and is hence not captured by the as-mapping constraints provided in (34).

    1. (51)
    1. LR: possessor alternation

5.3.3 Determiners

A determiner’s task is to determine a noun. It specifies some characteristics of a noun that are needed for syntactic and semantic purposes. For instance, it is generally assumed that common nouns are semantically predicates (of type ⟨e,t⟩) and that determiners convert them into entities (of type e) (cf. Barwise & Cooper 1981: 161–166; Heim & Kratzer 1998: 73–75; a.o.). The syntactic translation of this semantic fact in MGG approaches is that a common noun projects an NP (type ⟨e,t⟩) which is the complement of a determiner projecting a DP (type e). Building on that, it has been proposed that verbs selecting nominal arguments do not select NPs but DPs, as (52) suggests (cf. Longobardi 1994: 612–613; Chierchia 1998: 342; Adger 2003: 253; a.o.).49

    1. (52)
    1. a.
    1. *I bought [NP car].
    1.  
    1. b.
    1.   I bought [DP the [NP car]].

An NP approach has to deal with both the semantic and the syntactic fact. As pointed out by Bruening (2009: 31), “[i]t is generally accepted that semantic function-argument relations do not have to match syntactic head-complement/specifier relations”; therefore, it is not necessary to assume that D is the syntactic head of the phrase. In standard HPSG approaches (cf. Pollard & Sag 1994: Section 9.4), a linguistic object such as car in (52a) is not a fully saturated NP, since at least singular common nouns select (syntactically) for a determiner – this is the reason why the verb cannot be combined with it.50 On the other hand, there must be a mechanism for the determiner to determine the noun, i.e. to convert the noun from a predicate into an entity. Here, we use the standard HPSG assumption that the relation between determiner and noun is one of mutual selection (Pollard & Sag 1994: 50). The noun selects the SYNSEM value of a determiner through its SPR attribute, while the determiner selects the SYNSEM value of the non-saturated nominal it is combined with through its SPEC(IFIED) feature.

In (53), the lexical entry of the German feminine definite determiner die is provided. It is a syntactically fully saturated element (i.e. its SPR and COMPS lists are empty); its HEAD value is det(erminer), defined as having a SPEC attribute whose value is a non-saturated nominal object (i.e. Nʹ). This Nʹ is the nominal object the determiner is going to be combined with; it has an IND value 1 (its semantic content), which is structure shared with the ARG0 value of the def(initeness)-rel(ation) of the determiner, leading to a definite interpretation of the NC.

    1. (53)
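The entry just described can be sketched as follows (a simplified rendering; the exact feature geometry is abbreviated and partly our assumption):

```latex
% Sketch of the definite determiner die, based on the prose description:
% fully saturated, HEAD det with SPEC selecting an N' whose index is
% identified with the ARG0 of the def-rel; XARG is none.
\[
\textit{die:}\quad
\left[\begin{array}{ll}
\textsc{phon}  & \langle \textit{die} \rangle\\
\textsc{head}  & \textit{det}\,\bigl[\textsc{spec}\ \mathrm{N}'[\textsc{ind}\ \fbox{1}]\bigr]\\
\textsc{spr}   & \langle\,\rangle\\
\textsc{comps} & \langle\,\rangle\\
\textsc{xarg}  & \textit{none}\\
\textsc{cont}  & \textit{def-rel}\,\bigl[\textsc{arg0}\ \fbox{1}\bigr]
\end{array}\right]
\]
```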

In contrast, the lexical entry for the empty determiner (55) has no phonological contribution, similar to the lexical entry for traces (cf. Pollard & Sag 1994: 161; Müller 2019: 291; a.o.). The empty determiner selects some synsem element (i.e. the PreGen) through its SPR attribute,51 and – like any other determiner – it selects the N head through its SPEC feature. In order for the PreGen to be realised in the DetP while remaining accessible to the N head, we present an analysis based on the feature EXTERNAL ARGUMENT (XARG), first proposed by Sag & Pollard (1991: 89)52 for phenomena such as control and further developed in Sag (2007: 408–410) to deal with locality issues in phenomena such as idiomatic expressions (54), in which the synsem of the non-head daughter (e.g. his) must be visible at the phrasal level to be constrained from the outside (e.g. for co-reference).53

(54) Hei lost [hisi/*heri marbles]. (Sag 2007: 408)

Following Sag (2007: 403),

a construction cannot have direct access to properties of a mother and its granddaughters. If we observe that there is some such dependency, then we must provide an analysis in terms of some property of the granddaughter that is systematically encoded on the daughter, and hence rendered locally accessible at the higher level.

This is the task of the XARG attribute. It systematically encodes the synsem of the PreGen such that it is locally accessible at the level of the DetP (cf. (58)). The value of XARG is a synsem object. In contrast to the elements of SPR and COMPS, its value is not cancellable through saturation (Sag 2007: 410). Although the name XARG is reminiscent of the external argument used in MGG approaches, the concept behind it is not the same. XARG projects information of a non-head daughter to the phrasal level such that it is available for selection, e.g. for theta role and case assignment. In the lexical entry of the empty determiner (55), it is specified that the values of SPR and XARG are structure shared. In comparison, the XARG value of the definite determiner (53) is none, i.e. nothing is projected to the DetP level.

    1. (55)
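The core of the empty determiner's entry can be sketched along the following lines (again a simplified rendering based only on the prose; semantic attributes omitted, geometry partly assumed):

```latex
% Sketch of the empty determiner: no phonology, HEAD det selecting an N'
% via SPEC, one SPR element (the PreGen) whose synsem is identified with
% the XARG value and thereby projected to the DetP level.
\[
\emptyset\text{:}\quad
\left[\begin{array}{ll}
\textsc{phon}  & \langle\,\rangle\\
\textsc{head}  & \textit{det}\,\bigl[\textsc{spec}\ \mathrm{N}'\bigr]\\
\textsc{spr}   & \langle \fbox{1} \rangle\\
\textsc{comps} & \langle\,\rangle\\
\textsc{xarg}  & \fbox{1}
\end{array}\right]
\]
```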

Note that the as-mapping constraints in (34) determine that a noun taking a PreGen has a DetP with a synsem as its XARG value, which is also interpreted as one of the arguments of the noun. In addition, the empty determiner selects a synsem as specifier, whose value is structure shared with its XARG value. Therefore, when the empty determiner and the noun are combined, the specifier of the determiner (cf. (55)) will have the same SYNSEM value as the element in the noun’s ARG-ST list, i.e. an NP with structural case and a specific theta role.54

What about possessive determiners? They share similarities with definite determiners, but also with PreGens. Like definite determiners, they do not select elements through the valency lists, and they mark the noun as definite (cf. SPR and def-rel in (56)). But like PreGens, they can be interpreted as arguments of the noun (49). Therefore, the possessive determiner must be able to bear a theta role assigned by the N head. In contrast to PreGens, they do not bear genitive; otherwise determiner and noun would not agree in case. This behaviour of possessive determiners can be accounted for in our analysis. In the lexical entry (56), the possessive takes an NP as its XARG value. The INDEX value of this NP is structure shared with that of the determiner: it is an entity in 3rd person, singular, masculine or neuter, i.e. the INDEX value of seine ‘his’.55

    1. (56)
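A corresponding sketch of the possessive determiner (simplified; the placement of the index information and the semantic attributes are our assumption, reconstructed from the prose):

```latex
% Sketch of seine 'his': saturated like die, marks the noun as definite,
% but additionally carries an NP as XARG value whose index (3rd person,
% singular, masculine or neuter) can receive a theta role from the N head.
\[
\textit{seine:}\quad
\left[\begin{array}{ll}
\textsc{phon}  & \langle \textit{seine} \rangle\\
\textsc{head}  & \textit{det}\,\bigl[\textsc{spec}\ \mathrm{N}'[\textsc{ind}\ \fbox{1}]\bigr]\\
\textsc{spr}   & \langle\,\rangle\\
\textsc{comps} & \langle\,\rangle\\
\textsc{xarg}  & \mathrm{NP}\,\bigl[\textsc{index}\ \fbox{2}\ \textit{3rd, sg, masc} \vee \textit{neut}\bigr]\\
\textsc{cont}  & \textit{def-rel}\,\bigl[\textsc{arg0}\ \fbox{1}\bigr]
\end{array}\right]
\]
```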

This analysis reflects the double life of possessive determiners. On the one hand, they are functional elements determining the noun. On the other hand, they behave like nominal elements, introducing a discourse referent that can bear a theta role. Karnowski & Pafel (2004: 184) have therefore analysed them as D heads taking an empty pronominal DP; in our approach, this information is encoded in the lexical entry.56

5.4 Dominance schemata

To combine linguistic objects on a syntactic level, HPSG makes use of phrasal schemata (cf. Pollard & Sag 1994: 402–403; Sag 1997: 478–479; Müller & Machicao y Priemer 2019: 331–333). Behandlung ‘treatment’ is combined with its complement des Patienten ‘of the patient’ through the following schema (cf. also (58)):

    1. (57)

This schema, which licenses all combinations of heads with complements, states that if a linguistic object (N′ in (58)) is of type head-complement-phrase, then the SYNSEM value of the non-head daughter (the NP des Patienten in (58)) matches the constraints imposed by the first element of the COMPS list of the head daughter (N0 in (58)). The COMPS value of the resulting head-complement phrase (N′ in (58)) is the COMPS value of the head daughter without the SYNSEM value of the non-head daughter – the empty list in (58), since the head does not have further complements.

    1. (58)
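The constraint described above for (57) can be sketched as follows (a simplified rendering in standard HPSG notation; attribute paths abbreviated, details partly assumed):

```latex
% Sketch of the head-complement schema: the non-head daughter's SYNSEM
% saturates the first element of the head daughter's COMPS list; the
% remainder of that list becomes the mother's COMPS value.
\[
\textit{head-complement-phrase} \Rightarrow
\left[\begin{array}{ll}
\textsc{comps} & \fbox{1}\\
\textsc{head-dtr} & \bigl[\textsc{comps}\ \langle \fbox{2} \rangle \oplus \fbox{1}\bigr]\\
\textsc{non-head-dtrs} & \bigl\langle \bigl[\textsc{synsem}\ \fbox{2}\bigr] \bigr\rangle
\end{array}\right]
\]
```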

The combination of a head with its specifier (in our case Jacobs or die ‘the’ with Behandlung, or a subject with a verb in an SVO language) is constrained by the schema in (59). It states that if a linguistic object (NP in (58)) is of type head-specifier-phrase, then the SYNSEM value of the non-head daughter (DetP in (58)) is token identical with the last element of the SPR list of the head daughter (N′ in (58)).57 The SPR value of the resulting linguistic object is the SPR value of the head daughter without the SYNSEM value of the non-head daughter – the empty list in (58), since the head does not have further specifiers.

    1. (59)
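In parallel to the head-complement case, the head-specifier constraint can be sketched as follows (simplified; note that here it is the last element of SPR that is saturated, as described above):

```latex
% Sketch of the head-specifier schema: the non-head daughter's SYNSEM
% saturates the last element of the head daughter's SPR list; the rest
% of that list becomes the mother's SPR value.
\[
\textit{head-specifier-phrase} \Rightarrow
\left[\begin{array}{ll}
\textsc{spr} & \fbox{1}\\
\textsc{head-dtr} & \bigl[\textsc{spr}\ \fbox{1} \oplus \langle \fbox{2} \rangle\bigr]\\
\textsc{non-head-dtrs} & \bigl\langle \bigl[\textsc{synsem}\ \fbox{2}\bigr] \bigr\rangle
\end{array}\right]
\]
```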

The combination of the empty determiner58 with the NP Jacobs is licensed by the head-specifier-phrase schema (59) as well. The empty determiner itself further constrains that the SYNSEM value of its specifier is stored in XARG (cf. value of D0’s XARG in (58)). The value of XARG is projected onto the phrasal level to be visible for local selection (cf. value of DetP’s XARG in (58)). The argument mappings provided in (34) ensure that the specifier of the head noun has the expected argument as value of its own XARG. In this way, the theta role assigned by the head noun reaches the PreGen.

5.5 Compositionality, complexity, and recursion

As mentioned in Section 3.1, the possibility of having complex PreGens in German is a controversial topic in the literature. Complex PreGens are accepted in some analyses (cf. Vater 1991: 23; Müller 1999: 59–60; Sternefeld 2015: 212; Machicao y Priemer 2017: 233–234; Verhoeven & Lehmann 2018: 10), and rejected in others (cf. Bhatt 1990: 114–115; Olsen 1991: 48–49; Hartmann & Zimmermann 2003; Karnowski & Pafel 2004: 183), but as shown in (19), they can be found in corpora. Complex recursive structures in the German prenominal position are even more contentious (cf. Hartmann & Zimmermann 2003: 182; Roeper & Snyder 2005: 164; Kobele & Zimmermann 2012: 229; Andrews 2017: 2; Chomsky et al. 2019: 4). However, such examples can be found in corpora as well (60). Normally, recursive PreGens are left-branching (60a–c), but it is also possible to find examples like (60d) in which the N head of the PreGen Sohnes ‘son’s’ has a PostGen Gottes ‘god’s’. It is also worth mentioning that the PreDat structure shown in Section 5.3.1 can be recursive as well, as (60e) from a slogan for federal elections in Germany discussed in Karnowski & Pafel (2004: 181) and Zifonun (2003: 100) shows.

    1. (60)
    1. a.
    1. Ihres
    2. your
    1. Vaters
    2. father.GEN
    1. Vaters
    2. father.GEN
    1. Schwester
    2. sister.GEN
    1. Mann
    2. man
    1. (DECOW)
    2.  
    1. ‘your father’s father’s sister’s husband’
    1.  
    1. b.
    1. Einschüsse
    2. bullet.holes
    1. auf
    2. on
    1. [Peters
    2.   Peter.GEN
    1. Bruders
    2. brother.GEN
    1. Harley]59
    2. Harley
    1. ‘bullet holes on Peter’s brother’s Harley’
    1.  
    1. c.
    1. mit
    2. with
    1. [des
    2.   the.GEN
    1. Vaters
    2. father.GEN
    1. Bruders
    2. brother.GEN
    1. Witwe]
    2. widow.DAT
    1. (DECOW)
    2.  
    1. ‘with the father’s brother’s widow’
    1.  
    1. d.
    1. Maria
    2. Maria
    1. ist
    2. is
    1. [des
    2.   the.GEN
    1. Sohnes
    2. son.GEN
    1. Gottes
    2. God.GEN
    1. Mutter].
    2. mother
    1. (DECOW)
    2.  
    1. ‘Maria is god’s son’s mother.’
    1.  
    1. e.
    1. Ich
    2. I
    1. wähl
    2. vote
    1. [Doris
    2.   Doris.DAT
    1. ihrem
    2. her.DAT
    1. Mann
    2. husband.DAT
    1. seine
    2. his.ACC
    1. Partei].
    2. party.ACC
    1. ‘I am voting for Doris’ husband’s party.’

Given that corpus data can be presented and that the constructions are (difficult but) interpretable, we assume that complex as well as recursive PreGens are possible in German (cf. also Haider 1988: 56; Machicao y Priemer 2017: 239; Verhoeven & Lehmann 2018: 10). Nevertheless, several factors determine the lower frequency of complex PreGens in contrast to PostGens and other types of structures. First, as shown in Karlsson (2007: 114–116) and Verhoeven & Lehmann (2018: 4), a.o., some types of (self-)embedding structures are generally less preferred than others across phrasal types. For instance, center embedding is less preferred than right embedding (i.e. PostGens), and left embedding (i.e. PreGens) is the least preferred option. Second, Verhoeven & Lehmann’s study also shows that, in language use, NPs are less frequently complex (in terms of embedding) than VPs and CPs (cf. also Andrews 2017). The third reason has to do with the function of PreGens in contrast to PostGens. NCs with PreGens are difficult to parse because the PreGen is used as part of a semantic function to identify the entity referred to by the N head, i.e. as part of the function of the determiner (cf. Haider 1988: 39). For instance, Jacobs in (61a) tells us which convalescence we are referring to, namely one related to Jacob. Adding further PreGens increases the parsing difficulty of such structures. For instance, in (60c), to identify the referent of Witwe ‘widow’ we need to know the referent of Vaters ‘father’s’; then we can identify the referent of Vaters Bruders ‘father’s brother’s’; only then are we able to localise the referent of Witwe ‘widow’.

    1. (61)
    1. a.
    1.   Jacobs
    2.   Jacob.GEN
    1. Genesung
    2. convalescence.NOM
    1. verlief
    2. went
    1. problemlos.
    2. trouble-free
    1.   ‘Jacob’s recovery went smoothly.’
    1.  
    1. b.
    1. *Genesung
    2.   convalescence.NOM
    1. Jacobs
    2. Jacob.GEN
    1. verlief
    2. went
    1. problemlos.
    2. trouble-free

Furthermore, as (62) shows, common nouns can easily be used as PreGens, but, unlike proper names or kinship terms, they are almost impossible to find in recursive structures. The question arises: should the grammar rule out these constructions, or is there another reason for not finding them? Any grammar allowing PreGens with common nouns (62) and recursive PreGen structures with proper nouns and kinship terms (60) would allow recursive PreGen structures with common nouns as well (regardless of a DP or NP analysis).60 The remarks just provided concerning structural complexity suggest a possible explanation for the absence of these structures: proper names and relational nouns (in contrast to common nouns) are preferred in such positions as they are easier to interpret – their referents are more salient. That is, the reason we do not find common nouns in these structures is possibly a performance factor. Therefore, an approach able to analyse these complex recursive constructions is nevertheless needed – and our analysis provides one.

    1. (62)
    1. a.
    1. mit
    2. with
    1. [des
    2.   the.GEN
    1. Donners
    2. thunder.GEN
    1. Krachen]
    2. crashing
    1. (DECOW)
    2.  
    1. ‘with the thunder’s crashing’
    1.  
    1. b.
    1. [Des
    2.   the.GEN
    1. Katers
    2. cat.GEN
    1. Leben]
    2. life
    1. bestimmt
    2. determines
    1. meines.
    2. mine
    1. (DECOW)
    2.  
    1. ‘The cat’s life determines mine.’

A further argument for our analysis, particularly for the empty determiner, is provided by the compositionality of the construction.61 As shown in (52), a singular common noun in German needs a determiner (or a quantifier) in order to appear as an argument of a verb. The asymmetry between PreGen and PostGen w.r.t. the necessity of a determiner (61a)–(61b) suggests that they do not have the same function within the NC (cf. Hartmann & Zimmermann’s analysis of PreGens as D heads). On the other hand, it has been shown that PreGens and PostGens can equally be interpreted as arguments of the N head. In our analysis, we guarantee that not all elements in the SPR list of N get a theta role assigned (e.g. definite determiners do not). The possibility of bearing a theta role is regulated by the determiner, e.g. a possessive determiner and a PreGen with an empty determiner can get a theta role. That is, analysing PreGens and PostGens identically would be wrong, since the plain semantic combination of a head noun such as Genesung ‘convalescence’ with a PreGen argument such as Jacobs in (61a) would lead to a saturation of the semantic valency of the noun, but not to its determination. That is to say, a semantic operator determining (or quantifying over) the noun would be missing. The semantic analysis we provide is based on the following premises. First, the genitive NPs in prenominal and postnominal position are – at first sight – identical, and the genitive in both cases (in contrast to Partee 1997 for English, and Hartmann & Zimmermann 2003 and Olsen 1991 for German) is a marker of case with no further semantic contribution. Accordingly, the meanings of eines ‘a.GEN’, Mannes ‘man.GEN’, and their combination eines Mannes can be specified as usual (63).

    1. (63)
    1. a.
    1. ⟦eines⟧ = λPλQ∃x[P(x) ∧ Q(x)]
    1.  
    1. b.
    1. ⟦Mannes⟧ = λx[man(x)]
    1.  
    1. c.
    1. ⟦eines Mannes⟧ = λQ∃x[man(x) ∧ Q(x)]
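The result in (63c) follows from (63a) and (63b) by one step of function application (our sketch, writing the indefinite as an existential quantifier):

```latex
\[
\llbracket \textit{eines Mannes} \rrbracket
  \;=\; \lambda P\,\lambda Q\,\exists x\,[P(x) \wedge Q(x)]\;(\lambda z\,[\mathit{man}(z)])
  \;=\; \lambda Q\,\exists x\,[\mathit{man}(x) \wedge Q(x)]
\]
```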

Secondly, to avoid having to assume two different empty determiners (one for definite descriptions in the genitive, another for genitive quantified NCs),62 proper names and other definite descriptions are type-shifted from elements of type e to quantifiers of type ⟨⟨e,t⟩,t⟩ (64b), following Partee (1987: 121). This is justified by the fact that proper names and quantified NPs can be coordinated (64a).

    1. (64)
    1. a.
    1. [[Jacobs]
    2.     Jacob.GEN
    1. oder
    2. or
    1. [eines
    2.   an.GEN
    1. anderen
    2. other.GEN
    1. Mannes]]
    2. man.GEN
    1. Freund
    2. friend.NOM
    1. war
    2. was
    1. da.
    2. there
    1. ‘Jacob’s or another man’s friend was there.’
    1.  
    1. b.
    1. ⟦Jacobs⟧ = λQ[Q(jacob)]

Thirdly, as pointed out by Barker (1995: 50–55; 2012: 1114), N heads with PreGens have a relational interpretation (cf. also Szabolcsi 1994: 197). Therefore, not only inherently relational but also non-relational nouns in such a construction need a semantic structure supporting a relational reading. Hence, non-relational nouns need to be type-shifted into relational ones.63 For instance, nouns such as Freund ‘friend’ (65a) and Genesung ‘convalescence’ (65b) are inherently relational, while Fahrrad ‘bicycle’ is type-shifted to relational, with an unspecified relation R holding between the entity in the PreGen and the bicycle.

    1. (65)
    1. a.
    1. ⟦Freund⟧ = λx λy[friend(x)(y)]
    1.  
    1. b.
    1. ⟦Genesung⟧ = λx λy[convalescence(x)(y)]
    1.  
    1. c.
    1. ⟦Fahrrad⟧ = λx λy[bicycle(y) ∧ R(x)(y)]

The empty determiner (66) takes care of the semantic composition of PreGen and N head (cf. (67)). By means of function application, it first takes a quantified NP (Qʹ ∈ D⟨⟨e,t⟩,t⟩) as argument (e.g. (63c) or (64b)), giving as a result a function from relational nouns (f ∈ D⟨e,⟨e,t⟩⟩) to quantifiers (⟨⟨e,t⟩,t⟩). Then, the resulting complex determiner (e.g. [eines Mannes ∅]) takes the relational noun as its argument, resulting in a quantifier that can be combined with a verb.

    1. (66)
    1. ⟦∅⟧ := λQʹλf[λP[Qʹ(λx[P(σy[f(x)(y)])])]]
    1. (67)
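Assuming the denotation ⟦∅⟧ = λQʹλf λP[Qʹ(λx[P(σy[f(x)(y)])])] and the existential reading of eines Mannes, the composition in (67) can be spelled out as a β-reduction (our sketch; variable names are ours):

```latex
\[
\begin{aligned}
\llbracket \emptyset \rrbracket(\llbracket \textit{eines Mannes}\rrbracket)
  &= \lambda f\,\lambda P\,\bigl[\lambda Q\,\exists x[\mathit{man}(x) \wedge Q(x)]\,
     (\lambda x'[P(\sigma y[f(x')(y)])])\bigr] \\
  &= \lambda f\,\lambda P\,\exists x\bigl[\mathit{man}(x) \wedge P(\sigma y[f(x)(y)])\bigr] \\
\llbracket \textit{eines Mannes}\ \emptyset\ \textit{Freund}\rrbracket
  &= \lambda P\,\exists x\bigl[\mathit{man}(x) \wedge P(\sigma y[\mathit{friend}(x)(y)])\bigr]
\end{aligned}
\]
```

The last line is the generalized quantifier described in the following paragraph: a predicate P holds of the sum of some man's friends.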

The resulting interpretation of the NC in (67) is a function from predicates to truth values (since it is a quantifier) such that “there is a man (x) who is in a friendship with an individual y and a predicate P applies to y”. The variable y is bound by a sum operator σ – introduced by the determiner – due to possible interactions of the NP with other quantifiers. In other approaches (cf. Partee 1997: 466–467; Hartmann & Zimmermann 2003: 180), it has been proposed to bind the referential variable of the N head with the ι operator, but this interpretation is too restricted. For instance, (68) has a possible interpretation in which a sum individual consisting of the different men’s dreams was fulfilled, whereas the ι operator would only give us the unique dream shared by all men.64

    1. (68)
    1. Aller
    2. all.GEN
    1. Männer
    2. men.GEN
    1. Traum
    2. dream.NOM
    1. wurde
    2. was
    1. erfüllt.
    2. fulfilled
    1. ‘All men’s dream was fulfilled.’

Similar to Hartmann & Zimmermann (2003: 176),65 we treat the PreGen with the empty Det as a semantic functor (cf. (13)). In contrast to their analysis, our solution (i) allows for complex PreGens (e.g. quantified NPs), (ii) is compatible with recursive structures, and (iii) is based on an NP analysis. Furthermore, in our approach, the empty Det (and not the genitive affix) determines which type of semantic objects it can combine with, while the relational semantics is provided by the noun, allowing us to assign theta roles to its arguments and possessives.
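The composition in (66)–(67) can also be checked in a small finite model. The following Python sketch is purely illustrative: the model, the individuals, and the encoding of σ as a set-forming sum operator over a finite domain are all our invention, not part of the analysis.

```python
# Toy model-theoretic check of (66)-(67). Atoms are strings; a "sum
# individual" is crudely modelled as a frozenset of atoms.
DOMAIN = {"hans", "karl", "a", "b", "c"}
MEN = {"hans", "karl"}
FRIEND = {("hans", "a"), ("hans", "b"), ("karl", "c")}  # (x, y): y is x's friend

man = lambda x: x in MEN
friend = lambda x: lambda y: (x, y) in FRIEND  # (65a): λxλy[friend(x)(y)]

def sigma(f):
    """σy[f(y)]: the sum (here: frozenset) of all atoms satisfying f."""
    return frozenset(y for y in DOMAIN if f(y))

# (63c): [[eines Mannes]] = λQ∃x[man(x) ∧ Q(x)]
eines_mannes = lambda Q: any(man(x) and Q(x) for x in DOMAIN)

# (66): [[∅]] = λQ'λf λP[Q'(λx[P(σy[f(x)(y)])])]
empty_det = lambda Qp: lambda f: lambda P: Qp(lambda x: P(sigma(f(x))))

# (67): [[eines Mannes ∅ Freund]] -- a generalized quantifier over friend-sums
nc = empty_det(eines_mannes)(friend)

print(nc(lambda s: "a" in s))     # True: Hans's friend-sum {a, b} contains a
print(nc(lambda s: "hans" in s))  # False: no man's friend-sum contains hans
```

The check confirms that the NC denotes a quantifier: it holds of exactly those predicates that are true of the sum of some man's friends.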

6 Alternative NP analyses in HPSG

A different NP analysis in HPSG is based on the head-functor-phrase (cf. Van Eynde 1998; 2006; Allegranza 1998; 2007). In this approach (cf. (69)), the determiner (or PreGen) is not treated as a dependent of N, but as a functor taking the N head and marking it (Allegranza 2007: 260). This analysis is reminiscent of the treatment of NCs in Categorial Grammar, where the determiner (a functor) selects N, licensing an NP/DP (cf. Vennemann & Harlow 1977; Bouma 1988; Steedman 1989; 2000).

    1. (69)

The functor analysis works with the features MARK(ING) and SEL(ECT). The MARK value of a lexical item reflects its degree of saturation: unmarked if it needs a determiner to form a complete phrase (e.g. singular common nouns, cf. (69)), and marked if it already has a determiner (cf. (69)) or is fully saturated by itself (e.g. mass nouns) (Van Eynde 2020a: 7–8; 2006: 166–170). The SEL attribute is similar to SPEC (cf. Section 5.3.3). Its value constrains the type of head a functor combines with. For instance, the functors Jacobs or die in (69) select an unmarked N′, licensing an NP. Since the head-functor-phrase is a subtype of headed-phrase, the head value of the head daughter is projected (cf. Footnote 30).66 Furthermore, the Marking Principle states that the MARK value of a head-complement phrase is shared with the head daughter, but for head-functor phrases it is shared with the functor (Van Eynde 2006: 166; 2020a: 10; based on Pollard & Sag 1994: 400).

This approach has the following advantages (Van Eynde 2020b: xiv): First, it simplifies the phrase structural component, accounting for head-adjunct and head-specifier phrases with the head-functor-phrase alone. Second, it analyses NCs as NPs, in line with the pro-NP arguments given here. Third, it eliminates the functional parts of speech (determiner, numeral, auxiliary, etc.); e.g. determiners are defined by means of their MARK and SEL values (Van Eynde 2006: 164). That is, in (69), Jacobs and die are fully saturated, marked “nouns” selecting an unmarked N′.

On the other hand, the prenominal position in German poses several challenges for the functor analysis. For instance, if the combination of an argumental PreGen and a noun is licensed by the head-functor-phrase, N does not select the functor, hence it cannot determine the PreGen’s theta role (cf. (24b), (24c), (26)); and since functors cannot modify the valency of the head (Abeillé et al. 2004: 23), the PreGen cannot be taken to be an argument of N. Furthermore, in the functor analysis, possessive determiners provide the possessive semantics (Van Eynde 2020a: 12; see also Ginzburg & Sag 2000: 399): the functor selects N, determining its referent as possessed. But possessive determiners and PreGens have a double life (cf. Section 5.3.2): they can be interpreted with different theta roles depending on the noun, or with a possessive role (see LR (51)). To account for that (cf. (49a)), homonymous determiners would have to be proposed. Moreover, it would not be clear how a prenominal possessive can be interpreted as a patient only if the agent or possessor is not realised postnominally (cf. (49b)). To account for these data, the head would need access to the functor, leading either to a double selection as proposed here (N selects DetP through SPR, Det selects N through SPEC), or to non-local selection, i.e. the functor would have to see which nominal arguments have already been realised (cf. (20) and its discussion).

A further problem of this account concerns the quantification of N. To avoid empty determiners, the functor analysis proposes that plural and mass nouns, i.e. elements that can appear without (overt) determiners (70a), have a lexically inherent (existential) quantifier (Allegranza 1998: 99–103). But since plural and mass nouns can combine with a determiner, this account is forced to delete the inherent quantifier when an overt determiner appears (70b) (cf. Allegranza 1998: 104 for technical details).67 Besides the difficulties w.r.t. compositionality arising from deleting semantic material through syntactic combination (cf. Copestake et al. 2001; Bender et al. 2015), this account would also have problems with nouns in predicative structures (70c), where nouns are generally assumed to denote predicates and not entities or quantified elements.68 That is, the quantifier must be deleted there as well, but not through the addition of a different determiner as proposed in Allegranza (1998: 104).

    1. (70)
    1. a.
    1. I have to buy coffee.
    1.  
    1. b.
    1. I have to buy my coffee.
    1.  
    1. c.
    1. This beverage is coffee and that one is tea.

Summarising, the functor approach simplifies the grammatical component: it removes empty determiners, makes the part-of-speech classification more parsimonious, and simplifies the phrase structural component.69 Nevertheless, the economical advantage of this approach decreases with the difficulties it encounters: different deletion rules for quantifiers must be added at the syntactic level, several possessive determiners with different theta roles must be added to the lexicon, pre- and postnominal asymmetries are not accounted for, etc. In our approach, we handle these difficulties straightforwardly. The price we have to pay in some cases is a phonetically empty determiner, which is in any case empirically justified on the basis of the need for a quantifier.70 Furthermore, our (specifier) approach reflects the parallelism between sentences and NCs (2), since the DetP is treated as part of the valency of N, in contrast to the functor approach.

Another possibility to account for NPs in HPSG, proposed to us by a reviewer, is to analyse PreGens as raised arguments by means of weak heads. Weak heads are lexical heads introduced to deal e.g. with the non-prepositional uses of de in French, i.e. de-N′ combinations that do not behave as PPs, but as NPs, such as in beaucoup de livres ‘a lot of books’ (cf. Abeillé et al. 2004: 9; 2006: 156). Adapting this concept to NC structures, determiners would be analysed as weak heads.

    1. (71)

In (71), the weak head would have to be an empty element71 that combines with a noun by means of the head-complement-phrase, so no further phrasal constraints (e.g. head-marker-phrase or head-functor-phrase) have to be added. Similar to the functor approach, weak heads use the MARK feature and the Marking Principle. The weak head selects for an unmarked N′ with an empty COMPS list and projects its own MARK value, i.e. marked. The peculiarity of weak heads is that they adopt the HEAD, SPR, and CONT values of their complements. Therefore, when a weak head selects for a noun, the projected phrase is an NP. Furthermore, if the noun selected for a specifier, the phrase resulting from the combination with the weak head selects for the same specifier, rendering a raising analysis of PreGens, i.e. weak heads are – per definition – subject raisers (Abeillé et al. 2006: 156).72 In the case of NCs with PreGens, the weak head has to be realised by an empty element, because weak heads are lexical heads (Abeillé et al. 2006: 156), i.e. it is not possible to assume that (complex) prenominal genitive phrases are weak heads. Similar to the empty Det in (55), it would have its own semantic contribution and combine it with the semantic contribution of N.

But a weak head analysis does not work for NCs without changes to the central concept of weak heads. First, XARG is needed, and the weak head has to raise its value, not the element in SPR: assuming that a noun has its PreGen in SPR would lead to PreGen-noun structures (licensed through the head-specifier-phrase) in which the noun is semantically not determined. Second, in the case of PreDats, the weak head cannot just raise the specifier of the noun; it also has to assign dative to it. Third, w.r.t. possessive determiners, the weak head itself needs an XARG value (cf. (56) and its discussion). A further (minor) problem concerns the parallel between NCs and sentences, since subjects are not expected to take the verb as a syntactic complement. A major problem of a raising analysis in general concerns constituency, since it implies a constituent structure in which the determiner combines with the noun, and the PreGen/PreDat combines with the resulting phrase (cf. (18)). As (72) shows, regardless of their complexity, the prenominal elements can occupy the same position and are in complementary distribution (10). This strongly suggests that the prenominal complex is a constituent (cf. Kim 2020: 54).

    1. (72)
    1. {dem
    2.   the.DAT
    1. Basti
    2. Basti.DAT
    1. seine
    2. his
    1. /
    2.  
    1. Basti-s
    2. Basti-GEN
    1. /
    2.  
    1. die
    2. the
    1. /
    2.  
    1. meine}
    2. my
    1. Behandlung
    2. treatment
    1. ‘Basti’s / Basti’s / the / my treatment’

The constituent structure we propose in (58) takes this into account, combining the PreGen and the empty Det first, and the complex DetP with N afterwards. NP approaches in HPSG (cf. Pollard & Sag 1994: 53; Ginzburg & Sag 2000: 193; Kim 2020: Section 3.1; for English), but also some DP analyses for German (Karnowski & Pafel 2004: 181–184; Hartmann & Zimmermann 2003: 180; a.o.), support this structure, reflecting the distributional properties of the prenominal constituent and the empirical fact that PreGen and empty Det together determine the denotation of N. Therefore, our approach accounts for the data while leading to a more parsimonious analysis. Determiners and complex prenominal structures are accounted for with the same syntactic machinery, the parallel between NPs and sentences still holds, and the constituent structure supported by data and literature is reflected in our analysis.

7 Conclusions

First, we have given an outline of how the DP analysis developed (Section 2). The history of the DP analysis helps us understand how changes in the theoretical axioms of a framework lead to changes in the analysis of NCs. This is particularly important as this development generates our current research question: DP or NP. In Section 3, we discussed some DP analyses for German that take into account the realisation of PreGens (Olsen 1991; Hartmann & Zimmermann 2003; Sternefeld 2009; 2015; a.o.). We focussed on (i) the type of element selected as determiner, (ii) how theta roles are assigned and which assumptions are needed to account for arguments inside NCs, and (iii) which problems a theory of local selection (Sag 2007; 2012) encounters in DP analyses. In Section 4, we have given a brief HPSG introduction in order to show the differences (w.r.t. MGG) that allow a local NP analysis. As our analysis shows, most of the tasks functional heads in MGG deal with are handled in the AVMs of lexical heads, enabling more surface-oriented approaches. In Section 5, we provided a detailed and formalised analysis of German NPs. First (Section 5.1), we concentrated on nominalisation of verbal stems and the (verb to noun) argument inheritance accounting for the mapping between argument structure and syntactic valency of lexemes. Second (Section 5.2), we presented the generalisation for case assignment in the verbal and nominal domain taking also the genitive vs. von-PP variation in German into account. Third (Section 5.3), concentrating on the prenominal position, we showed a similar structure in some German dialects: the prenominal dative with possessive. Moreover, we motivated the analysis of possessive elements as nominal arguments accounting for a uniform treatment of argumental NPs and possessives, hence giving an adequate description of the co-occurrence and constituent order restrictions. 
Fourth (Sections 5.4 and 5.5), we showed how the syntactic and semantic combination is modelled. We provided the phrasal constraints needed to license the structures at hand, resorting only to the standard phrasal constraints used in the framework. Furthermore, we developed a semantic account that provides the interpretation of PreGens even in complex and recursive structures such as those found in corpora. In Section 6, we discussed an alternative NP analysis in HPSG, the functor approach, as well as the possibility of accounting for NPs with a raising analysis based on weak heads. We compared the advantages and drawbacks of both analyses and concluded that our analysis best explains the data. As a reviewer mentioned, the DP vs. NP debate does not necessarily concern headedness alone, but comprises aspects of a functor-argument relation as well, and this is exactly what our analysis reflects. On the one hand, the N head projects its properties, determining its phrasal distribution. Furthermore, it determines which elements it can combine with, which positions these elements can occupy, and how they are interpreted. On the other hand, the determiner acts as a functor on the semantic level (cf. Copestake et al. 2001: 145), taking the PreGen and the N head as arguments (cf. (67)). That is, our approach allows the double selection (Det selects N, N selects DetP)73 needed to account for the data presented without neglecting the head status of N, which is the central question of the DP-NP debate.

Notes

1We use NC as a theory-neutral concept for structures called DP in some theories and NP in others.

2Most examples are taken from the DECOW corpus (www.webcorpora.org).

3We subsume frameworks in a Chomskyan tradition under the label Mainstream Generative Grammar (MGG), i.e. Transformational Grammar, Government-Binding Theory, and Minimalism, a.o. (cf. Chomsky 1970; 1981; 1995).

4We use the term external argument for ease of understanding. We do not assume external arguments to be licensed outside the phrase with the selecting lexical head.

5For explanatory reasons, the figures taken from Chomsky (1970: 211) have been simplified.

6The subject-of and object-of relations in MGG approaches are configurationally defined relations, not primitives of the grammar (cf. Speas 1990: 7–8), i.e. they describe positions in a (tree) structure.

7See Abney (1987: 51) and Hewson (1991: 317) for further literature.

8See for instance Hudson (1987; 1990); Hewson (1991); Netter (1994); Bresnan et al. (2016) for DP analyses in Word Grammar, Cognitive Grammar, HPSG, and LFG respectively.

9Abandoning the IP/DP parallel does not entail neglecting the parallel between NCs and sentences, it just has to be modelled differently. See e.g. Szabolcsi (1983; 1994) for the parallel between CP and NP/DP, and Sag et al. (2003: 64) for an HPSG proposal. See also Salzmann (2020: 8–13) for a critical review of the parallelism argument from a MGG perspective.

10The concept of a rich UG is still pursued in Cartographic approaches (cf. Cinque & Rizzi 2010). For discussion, see Müller (2015a: 39–40).

11Abney (1987: 51–56) discusses four analyses for ’s, all of them assuming a DP analysis. Olsen’s analysis parallels the one treating ’s as D head. It is worth mentioning that Abney’s analysis does not make a distinction between PreGens as arguments, modifiers, or possessives (cf. Footnote 12).

12Olsen (1991: 49) explicitly talks about possessive (and not argumental) PreGens, although the argumental vs. non-argumental distinction in her examples is not quite clear (cf. Section 5.3.2).

13See also Di Sciullo & Williams (1988: 78–88) for this type of phrase-to-head reanalysis.

14See Zwicky (1987) and Anderson (2008) for analyses of ’s as an inflectional affix or a special clitic, respectively. However, both accounts advocate a morphological analysis of ’s rather than treating it as a syntactic head. This depends of course on the treatment of functional content (cf. Anderson 2008: 18). In MGG approaches, functional content tends to induce syntactic structure, while in HPSG, as well as in Zwicky (1987) and Anderson (2008), this is not necessarily the case.

15For further details w.r.t. nominal inflection in German (and Dutch) in correlation with determiners, see Netter (1994) and Van Eynde (2006).

16Nachbar ‘neighbour’ is a noun with weak inflection. Hence, its paradigm in singular is as follows: NOM: Nachbar, ACC: Nachbarn, DAT: Nachbarn, GEN: Nachbarn.

17For similar proposals, see Haider (1988: 56) and Georgi & Salzmann (2011: 2074) for German; Adger (2003: 257–258) for English; Szabolcsi (1994: 214) for Hungarian. Karnowski & Pafel’s (2004: 181–184) approach shows similarities to Sternefeld’s and Hartmann & Zimmermann’s: they propose the combination of a DP in the genitive with an empty determiner, which together form a new D head.

18For a similar view but a different conclusion, see also Sportiche (2005: 41).

19See Sportiche (2005: 75–77) and Bruening (2009: 29) for different accounts of the English data.

20It is worth mentioning that locality was part of analyses at the beginning of X-bar theory as (3)–(6) show. Later analyses, assuming further functional projections, abandon this type of locality assumption or at least have to reinterpret it, expanding the locality domains (cf. Grimshaw 1991).

21See also Abney (1986: 16–18). It should be mentioned though that functional elements are not supposed to assign theta roles (cf. Chomsky & Lasnik 1993: 528; Alexiadou et al. 2007: 15).

22To which extent arguments are (not) inherited by deverbal nominalisation is a controversial topic (cf. Bierwisch 1989; Grimshaw 1990: Chapter 3; Kratzer 1996; Alexiadou et al. 2007: Part IV; Alexiadou 2010; Bücking 2010; 2012: 86–128; Machicao y Priemer 2017: Chapter 4.6; a.o.). We cannot address this issue in its entirety here, but we are assuming that all verbal arguments are inherited by the derived noun, as the examples in (24) and (26) strongly suggest.

23In the CP/DP parallel proposed in Szabolcsi (1983; 1994) for Hungarian, case assignment does not motivate the movement of the prenominal constituent to the specifier of DP. See Footnotes 40 and 45 for further details.

24See also Georgi & Müller (2010) for a Minimalist NP analysis based on reprojection of the N head.

25The type of a complex value is located at the top of the respective AVM.

26The Saussurean sign comprises two sides of linguistic objects: sound (signifiant) and meaning (signifié) (cf. Saussure 1916: 76–82). The notion of sign in HPSG goes further, comprising as well aspects of morphological, syntactic, and contextual nature (cf. Sag 2012: 71; 74–75).

27Empty elements are only adopted if there is strong language-internal evidence for them. A derivational concept such as movement is not used in HPSG (for a similar position in LFG, see Bresnan et al. 2016: 91, 210). Phenomena such as non-local dependencies have to be dealt with using other descriptive mechanisms such as structure sharing and head-filler constraints, a.o. (cf. Pollard & Sag 1994; Sag 1997; Müller 2015b: 944–956; Müller & Machicao y Priemer 2019: 336–339).

28The value of PHON is a list of phonemes, but for ease of readability, we represent it as an orthographic form. In HPSG, lists are written in angle brackets: ⟨ ⟩. The elements of lists are ordered and separated by commas, see also the values of ARG-ST and RELS.

29In (30), we are just giving the attribute-value pairs relevant for our case, i.e. we do not provide the non-local part of the structure. SYNSEM | LOC is the path to the value of LOC.

30The Head Feature Principle states that in a headed phrase the HEAD value of the head daughter has to be identical to the HEAD value of the phrase.

31The HEAD value of nouns is actually complex, but its internal structure is not relevant here.

32The elements in the ARG-ST list in (30) are just abbreviations for feature descriptions. NP[str]1 stands for the feature description of a completely saturated nominal element (cf. Footnote 44), i.e. it could be a non-projecting element (like a proper name), or an element with all its syntactic dependents realised. This “NP” has structural case (noted as [str]) (see Section 5.2 for case assignment) and its INDEX value is whatever 1 stands for.

33Since we are dealing with argument realisation, we cannot go into the details of the different readings of -ung or its morphosyntactic properties, which are also provided by means of the LR. See Bierwisch (1989; 2009); Ehrich & Rapp (2000); and Dölling (2015).

34Since the elements of the ARG-ST list are ordered according to the Accessibility Hierarchy, the first NC with structural case corresponds to the external, the second one to the internal argument. Furthermore, since N heads can also have sentential arguments, these could be combined with the PP arguments in 4. For the sake of clarity, however, we will not consider sentential arguments.

35It has been proposed that some nominal arguments are obligatory (cf. Grimshaw 1990: 49–54; Szabolcsi 1994: 232; Bücking 2010: 41; Barker 2012: 1111–1112).

36We mark optional arguments with ( ), (cf. (34)). See Jacobs (1994); Flickinger (2000: 22); De Kuthy & Meurers (2003); and Machicao y Priemer (2017: 178–230), for optionality accounts.

37We are concentrating on arguments with structural case, hence we simplify (34) leaving aside list(lex) and list(pp) mentioned before.

38The mapping constraints could also be incorporated in the LR (32), but they also apply to other types of nominalisation, e.g. to the German nominalised infinitive: das Kaufen ‘the buying’, which shows the same argument inheritance. Therefore, it is better to give the set of mapping constraints a name, making it possible to refer to them (i.e. to inherit from the type) in various LRs.

39Following Ginzburg & Sag (2000: 190), Kim (2020: 55), a.o., we use DetP as the phrase headed by a determiner (Det) in the specifier of a noun, hence different from the concept of DP.

40The CP/DP parallel proposed by Szabolcsi (1994) for Hungarian is not applicable to German either. First, in her analysis, the external argument with structural case is realised in the specifier of NP, i.e. after the (overt or silent) determiner. This DP analysis would force NCs with PreGens in German to have a silent D selecting an NP. But since arguments of German NCs are optional, the assumed empty determiner could also be realised with an NP without a PreGen, leading to an ungrammatical NC in German (31f). Second, the specifier of DP in Hungarian is equated with the specifier of CP, building on the extractability from this position; furthermore, elements in SpecDP bear dative, i.e. not structural, case. In German, these elements are realised with structural case and cannot be extracted (see also Section 5.3.1).

41That an argument is not in the ARG-ST list does not imply that it is deleted from the semantics of the sign. Mauner & Koenig (1999) have shown that the unexpressed agent in passive constructions is present in the lexical representation of the verb rather than derived from conceptual sources.

42The simplification of the Case Principle in (38) concerns cases of “raising”, which need a special treatment. For further details, cf. Meurers (1999) and Przepiórkowski (1999).

43See Smith (2003) and Verhoeven & Lehmann (2018) and the literature cited therein for more information about register dependency and morphosyntactic factors of this variation.

44DetP is an abbreviation for a (simple or complex) syntactically saturated linguistic object of type synsem with HEAD value determiner (cf. Footnotes 32 and 39).

45Szabolcsi (1983; 1994) describes a similar construction in Hungarian. In contrast to German, the Hungarian dative phrase can be extracted. Furthermore, Szabolcsi (1994: 203) does not analyse the dative morpheme in this construction as case marker. For German, there is no reason to assume that the phrase is not case marked.

46Nouns in German cannot assign dative case to their arguments. That is a strong indicator of case assignment by the determiner.

47See for instance the concept of proto-roles in the nominal domain in Barker & Dowty (1993).

48For a similar suggestion in LFG, see Bresnan et al. (2016: 315–316).

49A different approach is postulated in Sportiche (2005: 40–44) where NPs (not DPs) are assumed to be base generated in their argument positions being then moved to a position where they form a constituent with the D head (a DP).

50See Section 6 for a discussion of the functor approach and Netter (1994: 310–312) for a DP proposal in HPSG based on functional completeness.

51We will address the semantics of the empty determiner in Section 5.5.

52XARG (or EXT-ARG) as proposed in Sag & Pollard (1991: 89–92) has been implemented into Minimal Recursion Semantics in Copestake et al. (2001: 146) as a purely semantic attribute. XARG – as we use it – takes a synsem object as value.

53See also: “it would be desirable to use the same feature to make genitive pronouns that are realized within a given NP available for selection by elements outside that NP” (Sag 2007: 408).

54The argument-structure mapping in Section 5.1, together with the constraints posited by the specific determiners, can account for the ungrammaticality of (10a)–(10b), since the definite determiner does not allow the combination with a genitive, and also for the ungrammaticality of (31f)–(31h), since the empty determiner requires the genitive NC as its specifier.

55Recall that the possessive relation is added to the noun by means of the LR (51) (cf. also Szabolcsi 1994: 197, and for different HPSG accounts Demske 2001: 249 and Müller 1999).

56PreDats (cf. Section 5.3.1) can be analysed as dative NPs in the SPR of a possessive Det (in some German dialects) sharing their INDEX value with Det, due to co-reference. Through structure sharing of INDEX values, the PreDat gets the possessive theta role from the noun.

57Take into account that the constraint (59) states that all elements in COMPS must be saturated (see empty list). The elements of SPR are therefore “higher” arguments than the ones of COMPS.

58As mentioned in Footnote 14, the status of English ’s is a controversial topic. In our representation of ’s as syntactic head, we follow Pollard & Sag (1994: 53) and Ginzburg & Sag (2000: 192), but our analysis concentrates on the German data.

59URL: http://www.tautoo.de/galerie.html, accessed: 15/04/2020.

60A grammar ruling out recursive PreGens with common nouns would have (i) to impose constraints on different subtypes of nouns (by means of underspecification this can easily be done in HPSG) and (ii) to count embedding levels in order to allow (62) while penalising every single further embedding, e.g. des Donners Schlags Krachen ‘the crashing of the thunder’s strike’.

61For convenience, we provide the following semantic representations as lambda terms, for a similar approach, see Sag et al. (2020: 103).

62Take into account that we do not need one empty determiner for possessive and another for argumental readings (cf. (17) and (18)), since the relation is provided by the noun.

63This can be achieved with the type-shifter π (Barker 2012: 1114), π = λPλxλy[P(y) ∧ R(x)(y)].
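As an illustration of how the type-shifter works (our worked example, not taken from Barker 2012; hund′ and hegel′ are hypothetical constants, and R is the contextually supplied free relation), applying π to a sortal noun denotation yields a relational one, which can then take the possessor as its first argument:

```latex
\pi = \lambda P \lambda x \lambda y\,[P(y) \wedge R(x)(y)]

% shift the sortal noun denotation hund' ('dog') to a relational reading
\pi(\mathit{hund}') = \lambda x \lambda y\,[\mathit{hund}'(y) \wedge R(x)(y)]

% apply it to the possessor, as in 'Hegels Hund':
\pi(\mathit{hund}')(\mathit{hegel}') = \lambda y\,[\mathit{hund}'(y) \wedge R(\mathit{hegel}')(y)]
```

The resulting property holds of entities that are dogs and stand in the contextually determined relation R to Hegel, which is what allows non-relational nouns to combine with a PreGen at all.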

64See Krifka (1996), Yoon (1996), and Champollion & Krifka (2016) for more about sum individuals.

65See also Partee (1997) for a similar proposal.

66This is a major distinction from Categorial Grammar approaches analysing D as functor and head, cf. Vennemann & Harlow (1977: 246) vs. Bouma (1988: 36–38).

67This is also a problem for Netter’s (1994) DP account. His account based on functional completeness is syntactically interesting w.r.t. determinerless DPs, but it is semantically opaque.

68See Van Eynde (2015: 158–163) for a different treatment of predicative nouns. However, further difficulties arise from the combination of inherently quantified nouns with modal or privative adjectives, which do not assert the existence of the entity denoted by the noun (cf. Müller 2020).

69Take into account that eliminating the head-specifier-phrase in the nominal domain does not entail a simplification of the phrase structure component in general, since it would still have to be used for subjects in SVO languages (cf. Sag et al. 2003: 100–103; Müller & Machicao y Priemer 2019: 327). Another possibility is to assume a head-subject-phrase instead (cf. Van Eynde 2020b: xvii; Allegranza 2007: 261), but that is not more economical either.

70See also Müller (2013: 87) for further arguments. It is worth mentioning that empty elements can be replaced by unary rules, for details see Müller (2020: Section 5).

71For PreDats, the possessive determiner would be the weak head licensing the raising of a constituent.

72In Abeillé et al.’s (2004; 2006) weak head analysis, three valency lists are used: SUBJ, SPR, and COMPS. For explanatory purposes, we have adapted this to our model with only two lists: SPR and COMPS (cf. Müller & Machicao y Priemer 2019: 327). The raising problem remains the same.

73See also the specifier treatment of determiners in Categorial Unification Grammar (Bouma 1988).

Abbreviations

ACC = accusative, DAT = dative, F = feminine, GEN = genitive, NOM = nominative, PL = plural, SG = singular.

Acknowledgements

We want to thank the participants of the HPSG conference 2018, the 5th European Workshop on HPSG 2018, the DeMiNes Summer School 2019, the Workshop “New horizons in the study of nominal phrases” (especially the organizers Anke Holler and Andreas Blümel), and the Syntax-Semantik-Kolloquium at the Humboldt-Universität zu Berlin. In particular, we want to thank Sebastian Bücking, Frank Van Eynde, Elisabeth Verhoeven, Berry Claus, and especially Manfred Krifka and three anonymous reviewers for discussions, comments, and suggestions that have substantially improved the quality of this paper. All remaining errors are ours.

Funding information

This research has been partly funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – SFB 1412, 416591334.

Competing interests

The authors have no competing interests to declare.

References

  1. Abeillé, Anne, Olivier Bonami, Danièle Godard & Jesse Tseng. 2004. The syntax of French de-N′ phrases. In Stefan Müller (ed.), The 11th International Conference on Head-Driven Phrase Structure Grammar, Katholieke Universiteit Leuven, 6–26. Stanford, CA: CSLI Publications. 

  2. Abeillé, Anne, Olivier Bonami, Danièle Godard & Jesse Tseng. 2006. The syntax of French à and de: An HPSG analysis. In Patrick Saint-Dizier (ed.), Syntax and semantics of prepositions, 147–162. Dordrecht: Springer. DOI: https://doi.org/10.1007/1-4020-3873-9 

  3. Abney, Steven P. 1986. Functional elements and licensing. Paper presented at Generative Linguistics in the Old World, Gerona, Spain in 1986. 

  4. Abney, Steven P. 1987. The English noun phrase in its sentential aspect. Cambridge, MA: MIT dissertation. 

  5. Adger, David. 2003. Core syntax: A minimalist approach. Oxford: Oxford University Press. 

  6. Alexiadou, Artemis. 2010. Nominalizations: A probe into the architecture of grammar. Part I: The nominalization puzzle. Language and Linguistics Compass 4(7). 496–511. DOI: https://doi.org/10.1111/j.1749-818X.2010.00209.x 

  7. Alexiadou, Artemis, Liliane Haegeman & Melita Stavrou. 2007. Noun phrase in the generative perspective. Berlin: Mouton de Gruyter. DOI: https://doi.org/10.1515/9783110207491 

  8. Allegranza, Valerio. 1998. Determiners as functors: NP structure in Italian. In Sergio Balari & Luca Dini (eds.), Romance in HPSG, 55–107. Stanford, CA: CSLI Publications. 

  9. Allegranza, Valerio. 2007. The signs of determination: Constraint-based modeling across languages. Frankfurt am Main: Peter Lang. 

  10. Anderson, Stephen R. 2008. The English “group genitive” is a special clitic. English Linguistics 25(1). 1–20. DOI: https://doi.org/10.9793/elsj1984.25.1 

  11. Andrews, Avery. 2017. Prenominal possessives in English: What does the stimulus look like? Ms. https://ling.auf.net/lingbuzz/003568 

  12. Baltin, Mark R. 1989. Heads and projections. In Mark R. Baltin & Anthony S. Kroch (eds.), Alternative conceptions of phrase structure, 1–16. Chicago: University of Chicago Press. 

  13. Barker, Chris. 1995. Possessive descriptions. Stanford, CA: CSLI Publications. 

  14. Barker, Chris. 2012. Possessive and relational nouns. In Klaus von Heusinger, Claudia Maienborn & Paul Portner (eds.), Semantics: An international handbook of natural language meaning (Handbooks of Linguistics and Communication Science 33.2), 1109–1130. Berlin: De Gruyter Mouton. DOI: https://doi.org/10.1515/9783110255072.1109 

  15. Barker, Chris & David Dowty. 1993. Non-verbal thematic proto-roles. In Amy Schäfer (ed.), 23rd Annual Meeting of the North East Linguistic Society, 49–62. University of Massachusetts, Amherst, GLSA. 

  16. Barwise, Jon & Robin Cooper. 1981. Generalized quantifiers in natural language. Linguistics and Philosophy 4(2). 159–219. DOI: https://doi.org/10.1007/BF00350139 

  17. Bayer, Josef & Jaklin Kornfilt. 1989. Restructuring effects in German. DYANA Report University of Edinburgh. 

  18. Bender, Emily, Dan Flickinger, Stephan Oepen, Woodley Packard & Ann Copestake. 2015. Layers of interpretation: On grammar and compositionality. In Matthew Purver, Mehrnoosh Sadrzadeh & Matthew Stone (eds.), 11th International Conference on Computational Semantics, 239–249. London: Association for Computational Linguistics. 

  19. Bhatt, Christa. 1990. Die syntaktische Struktur der Nominalphrase im Deutschen [The syntactic structure of nominal phrases in German]. Tübingen: Gunter Narr. 

  20. Bierwisch, Manfred. 1989. Event nominalization: Proposals and problems. In Wolfgang Motsch (ed.), Wortstruktur und Satzstruktur, 1–73. Berlin: Akademie der Wissenschaften der DDR, Zentralinstitut für Sprachwissenschaft. 

  21. Bierwisch, Manfred. 2009. Nominalization: Lexical and syntactic aspects. In Anastasia Giannakidou & Monika Rathert (eds.), Quantification, definiteness, and nominalization, 281–320. Oxford: Oxford University Press. 

  22. Bildhauer, Felix. 2014. Head-Driven Phrase Structure Grammar. In Andrew Carnie, Yosuke Sato & Dan Siddiqi (eds.), The Routledge handbook of syntax, 526–555. Oxford: Routledge. DOI: https://doi.org/10.4324/9781315796604 

  23. Bouma, Gosse. 1988. Modifiers and specifiers in Categorial Unification Grammar. Linguistics 26(1). 21–46. DOI: https://doi.org/10.1515/ling.1988.26.1.21 

  24. Brame, Michael. 1982. The head-selector theory of lexical specifications and the nonexistence of coarse categories. Linguistic Analysis 10(4). 321–325. 

  25. Bresnan, Joan W., Ash Asudeh, Ida Toivonen & Stephen M. Wechsler. 2016. Lexical-functional syntax. Oxford: Blackwell Publishers Ltd. 2nd edn. DOI: https://doi.org/10.1002/9781119105664 

  26. Bruening, Benjamin. 2009. Selectional asymmetries between CP and DP suggest that the DP hypothesis is wrong. In Laurel MacKenzie (ed.), Proceedings of the 32nd Annual Penn Linguistics Colloquium, 26–35. Philadelphia, PA: University of Pennsylvania. 

  27. Bruening, Benjamin. 2020. The head of the nominal is N, not D: N-to-D movement, hybrid agreement, and conventionalized expressions. Glossa: A Journal of General Linguistics 5(1). 1–19. DOI: https://doi.org/10.5334/gjgl.1031 

  28. Bücking, Sebastian. 2010. Zur Interpretation adnominaler Genitive bei nominalisierten Infinitiven im Deutschen [On the interpretation of adnominal genitives in German nominalised infinitives]. Zeitschrift für Sprachwissenschaft 29(1). 39–77. DOI: https://doi.org/10.1515/zfsw.2010.002 

  29. Bücking, Sebastian. 2012. Kompositional flexibel: Partizipanten und Modifikatoren in der Nominaldomäne [Compositionally flexible: Participants and modifiers in the nominal domain]. Tübingen: Stauffenburg. 

  30. Champollion, Lucas & Manfred Krifka. 2016. Mereology. In Maria Aloni & Paul Dekker (eds.), The Cambridge handbook of formal semantics, 369–388. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9781139236157.014 

  31. Chierchia, Gennaro. 1998. Reference to kinds across languages. Natural Language Semantics 6(4). 339–405. DOI: https://doi.org/10.1023/A:1008324218506 

  32. Chomsky, Noam. 1965. Aspects of the theory of syntax. Cambridge, MA: MIT Press. DOI: https://doi.org/10.21236/AD0616323 

  33. Chomsky, Noam. 1970. Remarks on nominalization. In Roderick A. Jacobs & Peter S. Rosenbaum (eds.), Readings in English transformational grammar, 184–221. Waltham, MA: Ginn & Company. 

  34. Chomsky, Noam. 1981. Lectures on government and binding: The Pisa lectures. Dordrecht: Foris Publications. DOI: https://doi.org/10.1515/9783110884166 

  35. Chomsky, Noam. 1986. Barriers. Cambridge, MA: MIT Press. 

  36. Chomsky, Noam. 1995. The minimalist program. Cambridge, MA: MIT Press. 

  37. Chomsky, Noam. 2007. Approaching UG from below. In Uli Sauerland & Hans-Martin Gärtner (eds.), Interfaces + recursion = language?: Chomsky’s Minimalism and the view from syntax-semantics, 1–30. Berlin: De Gruyter. DOI: https://doi.org/10.1515/9783110207552 

  38. Chomsky, Noam, Ángel J. Gallego & Dennis Ott. 2019. Generative grammar and the faculty of language: Insights, questions, and challenges. Ms. To appear in Catalan Journal of Linguistics. https://ling.auf.net/lingbuzz/003507 

  39. Chomsky, Noam & Howard Lasnik. 1993. The theory of principles and parameters. In Joachim Jacobs, Arnim von Stechow, Wolfgang Sternefeld & Theo Vennemann (eds.), Syntax: An international handbook of contemporary research (Handbooks of Linguistics and Communication Science 9.1), 506–569. Berlin: Walter de Gruyter. DOI: https://doi.org/10.1515/9783110095869.1.toc 

  40. Cinque, Guglielmo & Luigi Rizzi. 2010. The cartography of syntactic structures. In Bernd Heine & Heiko Narrog (eds.), The Oxford handbook of linguistic analysis, 51–65. Oxford: Oxford University Press. DOI: https://doi.org/10.1093/oxfordhb/9780199544004.001.0001 

  41. Copestake, Ann, Alex Lascarides & Dan Flickinger. 2001. An algebra for semantic construction in constraint-based grammars. In Emily Bender, Dan Flickinger, Frederik Fouvry & Melanie Siegel (eds.), 39th Meeting of the Association for Computational Linguistics, 140–147. Toulouse: Association for Computational Linguistics. DOI: https://doi.org/10.3115/1073012.1073031 

  42. Copestake, Ann, Dan Flickinger, Carl Pollard & Ivan Sag. 2005. Minimal recursion semantics: An introduction. Research on Language and Computation 3(2–3). 281–332. DOI: https://doi.org/10.1007/s11168-006-6327-9 

  43. Davis, Anthony & Jean-Pierre Koenig. 2000. Linking as constraints on word classes in a hierarchical lexicon. Language 76(1). 56–91. DOI: https://doi.org/10.2307/417393 

  44. De Kuthy, Kordula & Detmar Meurers. 2003. Dealing with optional complements in HPSG-based grammar implementations. In Stefan Müller (ed.), The 10th International Conference on Head-Driven Phrase Structure Grammar, Michigan State University, 88–96. Stanford, CA: CSLI Publications. 

  45. de Saussure, Ferdinand. 1916. Cours de linguistique générale [Course in general linguistics]. Paris: Payot. DOI: https://doi.org/10.1515/9783111484327. Lecture notes ed. by Charles Bally and Albert Sechehaye. Translated into German and published in 1967 as Grundfragen der allgemeinen Sprachwissenschaft, Berlin: Walter de Gruyter. 

  46. Demske, Ulrike. 2001. Merkmale und Relationen: Diachrone Studien zur Nominalphrase des Deutschen [Features and relations: Diachronic studies on German NPs]. Berlin: Walter de Gruyter. DOI: https://doi.org/10.1515/9783110811353 

  47. Di Sciullo, Anna Maria & Edwin Williams. 1988. On the definition of word. Cambridge, MA: MIT Press. 

  48. Dölling, Johannes. 2015. Sortale Variation der Bedeutung bei ung-Nominalisierungen [Sortal variation of meaning in ung-nominalizations]. In Christian Fortmann, Anja Lübbe & Irene Rapp (eds.), Situationsargumente im Nominalbereich, 49–92. Berlin: de Gruyter Mouton. DOI: https://doi.org/10.1515/9783110432893-003 

  49. Ehrich, Veronika & Irene Rapp. 2000. Sortale Bedeutung und Argumentstruktur: ung-Nominalisierungen im Deutschen [Sortal meaning and argument structure: ung-nominalizations in German]. Zeitschrift für Sprachwissenschaft 19(2). 245–303. DOI: https://doi.org/10.1515/zfsw.2000.19.2.245 

  50. Flickinger, Dan. 2000. On building a more efficient grammar by exploiting types. Natural Language Engineering 6(1). 15–28. DOI: https://doi.org/10.1017/S1351324900002370 

  51. Georgi, Doreen & Gereon Müller. 2010. Noun-phrase structure by reprojection. Syntax 13(1). 1–36. DOI: https://doi.org/10.1111/j.1467-9612.2009.00132.x 

  52. Georgi, Doreen & Martin Salzmann. 2011. DP-internal double agreement is not double Agree: Consequences of Agree-based case assignment within DP. Lingua 121(14). 2069–2088. DOI: https://doi.org/10.1016/j.lingua.2011.07.010 

  53. Ginzburg, Jonathan & Ivan Sag. 2000. Interrogative investigations: The form, meaning, and use of English interrogatives. Stanford, CA: CSLI Publications. 

  54. Grimshaw, Jane. 1990. Argument structure. Cambridge, MA: MIT Press. 

  55. Grimshaw, Jane. 1991. Extended projection. Ms. Brandeis University. 

  56. Haider, Hubert. 1985. The case of German. In Jindřich Toman (ed.), Studies in German grammar, 65–101. Berlin: De Gruyter Mouton. DOI: https://doi.org/10.1515/9783110882711 

  57. Haider, Hubert. 1988. Die Struktur der deutschen Nominalphrase [The structure of German NPs]. Zeitschrift für Sprachwissenschaft 7(1). 32–59. DOI: https://doi.org/10.1515/ZFSW.1988.7.1.32 

  58. Haider, Hubert. 1993. Deutsche Syntax – generativ: Vorstudien zur Theorie einer projektiven Grammatik [German syntax – generative: Preliminary studies on the theory of a projective grammar]. Tübingen: Gunter Narr Verlag. 

  59. Hartmann, Katharina & Malte Zimmermann. 2003. Syntactic and semantic adnominal genitive. In Claudia Maienborn (ed.), (A-)symmetrien – (A-)symmetries: Beiträge zu Ehren von Ewald Lang, 171–202. Tübingen: Stauffenburg. 

  60. Hauser, Marc D., Noam Chomsky & W. Tecumseh Fitch. 2002. The faculty of language: What is it, who has it, and how did it evolve? Science 298(5598). 1569–1579. DOI: https://doi.org/10.1126/science.298.5598.1569 

  61. Heck, Fabian & Gereon Müller. 2006. Extremely local optimization. In Erin Bainbridge & Brian Agbayani (eds.), Proceedings of the 34th Western Conference on Linguistics 17. 170–182. Fresno, CA: California State University. 

  62. Heim, Irene & Angelika Kratzer. 1998. Semantics in generative grammar. Malden, MA: Blackwell. 

  63. Hellan, Lars. 1986. The headedness of NPs in Norwegian. In Pieter Muysken & Henk van Riemsdijk (eds.), Features and projections, 89–122. Berlin: de Gruyter Mouton. DOI: https://doi.org/10.1515/9783110871661-005 

  64. Hewson, John. 1991. Determiners as heads. Cognitive Linguistics 2(4). 317–338. DOI: https://doi.org/10.1515/cogl.1991.2.4.317 

  65. Hudson, Richard. 1987. Zwicky on heads. Journal of Linguistics 23(1). 109–132. DOI: https://doi.org/10.1017/S0022226700011051 

  66. Hudson, Richard. 1990. English word grammar. Oxford: Basil Blackwell. 

  67. Jackendoff, Ray. 1977. X-bar syntax: A study of phrase structure. Cambridge, MA: MIT Press. 

  68. Jacobs, Joachim. 1994. Das lexikalische Fundament der Unterscheidung von fakultativen und obligatorischen Ergänzungen [The lexical foundation of the distinction between optional and obligatory arguments]. Zeitschrift für germanistische Linguistik 22(3). 284–319. DOI: https://doi.org/10.1515/zfgl.1994.22.3.284 

  69. Karlsson, Fred. 2007. Constraints on multiple initial embedding of clauses. International Journal of Corpus Linguistics 12(1). 107–118. DOI: https://doi.org/10.1075/ijcl.12.1.07kar 

  70. Karnowski, Paweł & Jürgen Pafel. 2004. A topological schema for noun phrases in German. In Gereon Müller, Lutz Gunkel & Gisela Zifonun (eds.), Explorations in nominal inflection, 161–188. Berlin: Mouton de Gruyter. DOI: https://doi.org/10.1515/9783110197501.161 

  71. Keenan, Edward & Bernard Comrie. 1977. Noun phrase accessibility and universal grammar. Linguistic Inquiry 8(1). 63–99. 

  72. Kim, Jong-Bok. 2020. Form and function mapping in English syntax: A construction grammar approach. Edinburgh: Edinburgh University Press. To appear. 

  73. Kobele, Gregory M. & Malte Zimmermann. 2012. Quantification in German. In Edward Keenan & Denis Paperno (eds.), Handbook of quantifiers in natural language, 227–283. Berlin: Springer. DOI: https://doi.org/10.1007/978-94-007-2681-9_5 

  74. Koenig, Jean-Pierre. 1999. Lexical relations. Stanford, CA: CSLI Publications. 

  75. Kratzer, Angelika. 1996. Severing the external argument from its verb. In Johan Rooryck & Laurie Zaring (eds.), Phrase structure and the lexicon, 109–137. Dordrecht: Springer. DOI: https://doi.org/10.1007/978-94-015-8617-7_5 

  76. Krifka, Manfred. 1996. Pragmatic strengthening in plural predications and donkey sentences. In Teresa Galloway & Justin Spence (eds.), Proceedings of the 6th Semantics and Linguistic Theory, 136–153. Ithaca, NY: Cornell University. DOI: https://doi.org/10.3765/salt.v6i0.2769 

  77. Longobardi, Giuseppe. 1994. Reference and proper names: A theory of N-movement in syntax and logical form. Linguistic Inquiry 25(4). 609–665. 

  78. Lühr, Rosemarie. 1991. Adjazenz in komplexen Nominalphrasen [Adjacency in complex NPs]. In Gisbert Fanselow & Sascha W. Felix (eds.), Strukturen und Merkmale syntaktischer Kategorien, 33–50. Tübingen: Narr. 

  79. Lyons, John. 1977. Semantics, vol. 2. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/CBO9780511620614 

  80. Machicao y Priemer, Antonio. 2017. NP-arguments in NPs: An analysis of German and Spanish noun phrases in Head-Driven Phrase Structure Grammar. Berlin: Humboldt-Universität zu Berlin dissertation. DOI: https://doi.org/10.18452/20109 

  81. Machicao y Priemer, Antonio. 2018a. Konstituententest [Constituency test]. In Stefan Schierholz & Pál Uzonyi (eds.), Grammatik: Syntax (Wörterbücher zur Sprach- und Kommunikationswissenschaft 1.2), Berlin: De Gruyter. 

  82. Machicao y Priemer, Antonio. 2018b. Kopf [Head]. In Stefan Schierholz & Pál Uzonyi (eds.), Grammatik: Syntax (Wörterbücher zur Sprach- und Kommunikationswissenschaft 1.2), Berlin: De Gruyter. 

  83. Machicao y Priemer, Antonio & Paola Fritz-Huechante. 2018. Korean and Spanish psych-verbs: Interaction of case, theta-roles, linearization, and event structure in HPSG. In Stefan Müller & Frank Richter (eds.), The 25th International Conference on Head-Driven Phrase Structure Grammar, University of Tokyo, 155–175. Stanford, CA: CSLI Publications. 

  84. Manning, Christopher D. & Ivan Sag. 1998. Argument structure, valence, and binding. Nordic Journal of Linguistics 21(2). 107–144. DOI: https://doi.org/10.1017/S0332586500004236 

  85. Mauner, Gail & Jean-Pierre Koenig. 1999. Lexical encoding of event participant information. Brain and Language 68(1–2). 178–184. DOI: https://doi.org/10.1006/brln.1999.2096 

  86. Meurers, Detmar. 1999. Raising spirits (and assigning them case). Groninger Arbeiten zur Germanistischen Linguistik 43. 173–226. 

  87. Müller, Stefan. 1999. Deutsche Syntax deklarativ: Head-Driven Phrase Structure Grammar für das Deutsche [German syntax declarative: Head-Driven Phrase Structure Grammar for German]. Tübingen: Max Niemeyer. DOI: https://doi.org/10.1515/9783110915990 

  88. Müller, Stefan. 2003. Object-to-subject-raising and lexical rule: An analysis of the German passive. In Stefan Müller (ed.), The 10th International Conference on Head-Driven Phrase Structure Grammar, Michigan State University, East Lansing, 278–297. Stanford, CA: CSLI Publications. 

  89. Müller, Stefan. 2013. Head-Driven Phrase Structure Grammar: Eine Einführung [Head-Driven Phrase Structure Grammar: An introduction]. Tübingen: Stauffenburg. 3rd edn. 

  90. Müller, Stefan. 2014. Kernigkeit: Anmerkungen zur Kern-Peripherie-Unterscheidung [Core-ness: Notes on the core-periphery distinction]. In Antonio Machicao y Priemer, Andreas Nolda & Athina Sioupi (eds.), Zwischen Kern und Peripherie: Untersuchungen zu Randbereichen in Sprache und Grammatik, 25–40. Berlin: De Gruyter. DOI: https://doi.org/10.1524/9783050065335.25 

  91. Müller, Stefan. 2015a. The CoreGram Project: Theoretical linguistics, theory development and verification. Journal of Language Modelling 3(1). 21–86. DOI: https://doi.org/10.15398/jlm.v3i1.91 

  92. Müller, Stefan. 2015b. HPSG – A synopsis. In Tibor Kiss & Artemis Alexiadou (eds.), Syntax – theory and analysis: An international handbook (Handbooks of Linguistics and Communication Science 42.2), 937–973. Berlin: De Gruyter Mouton. DOI: https://doi.org/10.1515/9783110363708-004 

  93. Müller, Stefan. 2019. Grammatical theory: From transformational grammar to constraint-based approaches. Berlin: Language Science Press 3rd edn. DOI: https://doi.org/10.5281/zenodo.3364215 

  94. Müller, Stefan. 2020. Headless in Berlin: Headless (nominal) structures in Head-Driven Phrase Structure Grammar. In Horst Simon & Ulrike Freywald (eds.), Headedness and/or grammatical anarchy?, 1–49. Berlin: Language Science Press. To appear. 

  95. Müller, Stefan & Antonio Machicao y Priemer. 2019. Head-Driven Phrase Structure Grammar. In András Kertész, Edith Moravcsik & Csilla Rákosi (eds.), Current approaches to syntax – A comparative handbook, Berlin: De Gruyter Mouton. DOI: https://doi.org/10.1515/9783110540253-012 

  96. Müller, Stefan & Stephen M. Wechsler. 2014. Lexical approaches to argument structure. Theoretical Linguistics 40(1/2). 1–76. DOI: https://doi.org/10.1515/tl-2014-0001 

  97. Muysken, Pieter. 1982. Parametrizing the notion head. Journal of Linguistic Research 2(3). 57–75. 

  98. Netter, Klaus. 1994. Towards a theory of functional heads: German nominal phrases. In John A. Nerbonne, Klaus Netter & Carl Pollard (eds.), German in Head-Driven Phrase Structure Grammar (CSLI Lecture Notes 46), 297–340. Stanford, CA: CSLI Publications. 

  99. Olsen, Susan. 1991. Die deutsche Nominalphrase als “Determinansphrase” [The German NP as DP]. In Susan Olsen & Gisbert Fanselow (eds.), DET, COMP und INFL: Zur Syntax funktionaler Kategorien und grammatischer Funktionen, 35–56. Berlin: de Gruyter. DOI: https://doi.org/10.1515/9783111353838 

  100. Partee, Barbara H. 1987. Noun phrase interpretation and type-shifting principles. In Jeroen Groenendijk, Dick de Jongh & Martin Stokhof (eds.), Studies in discourse representation theory and the theory of generalized quantifiers, 115–143. Dordrecht: Foris Publications. DOI: https://doi.org/10.1515/9783112420027-006 

  101. Partee, Barbara H. 1997. Genitives: A case study. In Johan van Benthem & Alice ter Meulen (eds.), Handbook of logic and linguistics, 464–470. Amsterdam: Elsevier. Appendix to Theo M.V. Janssen, ‘Compositionality’. 

  102. Payne, John & Rodney D. Huddleston. 2002. Nouns and noun phrases. In Rodney D. Huddleston & Geoffrey K. Pullum (eds.), The Cambridge grammar of the English language, 323–523. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/9781316423530.006 

  103. Pollard, Carl & Ivan Sag. 1987. Information-based syntax and semantics. Volume 1: Fundamentals. Stanford, CA: CSLI Publications. 

  104. Pollard, Carl & Ivan Sag. 1994. Head-Driven Phrase Structure Grammar. Chicago: University of Chicago Press. 

  105. Przepiórkowski, Adam. 1999. Case assignment and the complement/adjunct dichotomy: A non-configurational constraint-based approach. Tübingen: Eberhard-Karls-Universität dissertation. 

  106. Pullum, Geoffrey K. & Barbara C. Scholz. 2001. On the distinction between model-theoretic and generative-enumerative syntactic frameworks. In Philippe de Groote, Glyn Morrill & Christian Retoré (eds.), 4th Logical Aspects of Computational Linguistics, 17–43. Le Croisic, France: Springer. DOI: https://doi.org/10.1007/3-540-48199-0_2 

  107. Radford, Andrew. 2000. NP shells. Essex Research Reports in Linguistics 33. 2–20. 

  108. Richards, Marc. 2015. Minimalism. In Tibor Kiss & Artemis Alexiadou (eds.), Syntax – theory and analysis: An international handbook (Handbooks of Linguistics and Communication Science 42.2), 803–839. Berlin: De Gruyter Mouton. DOI: https://doi.org/10.1515/9783110363708 

  109. Richter, Frank. 2007. Closer to the truth: A new model theory for HPSG. In James Rogers & Stephan Kepser (eds.), Model-theoretic syntax at 10: Proceedings of the ESSLLI 2007, 101–110. Dublin: Trinity College Dublin. 

  110. Rizzi, Luigi. 1990. Relativized minimality. Cambridge, MA: MIT Press. 

  111. Roeper, Tom W. & William Snyder. 2005. Language learnability and the forms of recursion. In Anna Maria Di Sciullo (ed.), UG and external systems: Language, brain and computation, 155–169. Amsterdam: John Benjamins. DOI: https://doi.org/10.1075/la.75.10roe 

  112. Sadler, Louisa & Doug J. Arnold. 1994. Prenominal adjectives and the phrasal/lexical distinction. Journal of Linguistics 30(1). 187–226. DOI: https://doi.org/10.1017/S0022226700016224 

  113. Sag, Ivan. 1997. English relative clause constructions. Journal of Linguistics 33(2). 431–483. DOI: https://doi.org/10.1017/S002222679700652X 

  114. Sag, Ivan. 2007. Remarks on locality. In Stefan Müller (ed.), The 14th International Conference on Head-Driven Phrase Structure Grammar, Stanford Department of Linguistics and CSLI’s LinGO Lab, 394–414. Stanford, CA: CSLI Publications. 

  115. Sag, Ivan. 2012. Sign-based construction grammar: An informal synopsis. In Hans C. Boas & Ivan Sag (eds.), Sign-based construction grammar, 69–202. Stanford, CA: CSLI Publications. 

  116. Sag, Ivan & Carl Pollard. 1991. An integrated theory of complement control. Language 67(1). 63–113. DOI: https://doi.org/10.2307/415539 

  117. Sag, Ivan, Rui Chaves, Anne Abeillé, Bruno Estigarribia, Frank Van Eynde, Dan Flickinger, Paul Kay, Laura Michaelis-Cummings, Stefan Müller, Geoffrey Pullum & Tom Wasow. 2020. Lessons from the English auxiliary system. Journal of Linguistics 56(1). 87–155. DOI: https://doi.org/10.1017/S002222671800052X 

  118. Sag, Ivan & Thomas Wasow. 2011. Performance-compatible competence grammar. In Robert D. Borsley & Kersti Borjars (eds.), Non-transformational syntax: Formal and explicit models of grammar, 359–377. Oxford: Wiley-Blackwell. DOI: https://doi.org/10.1002/9781444395037.ch10 

  119. Sag, Ivan A., Thomas Wasow & Emily M. Bender. 2003. Syntactic theory: A formal introduction. Stanford, CA: CSLI Publications 2nd edn. 

  120. Salzmann, Martin. 2020. The NP vs. DP debate: Why previous arguments are inconclusive and what a good argument could look like: Evidence from agreement with hybrid nouns. Glossa: A Journal of General Linguistics 5(1). 83. 1–46. DOI: https://doi.org/10.5334/gjgl.1123 

  121. Schumacher, Helmut, Jacqueline Kubczak, Renate Schmidt & Vera de Ruiter. 2004. VALBU – Valenzwörterbuch deutscher Verben [Valency dictionary for German verbs]. Tübingen: Gunter Narr. 

  122. Smith, George. 2003. On the distribution of the genitive attribute and its prepositional counterpart in Modern Standard German. In Sudha Arunachalam, Elsi Kaiser, Ian Ross, Tara Sanchez & Alexander Williams (eds.), Proceedings of the 25th Annual Penn Linguistics Colloquium 8(1). 173–186. Philadelphia, PA: University of Pennsylvania. 

  123. Speas, Margaret. 1990. Phrase structure in natural language. Dordrecht: Kluwer Academic Publishers. DOI: https://doi.org/10.1007/978-94-009-2045-3 

  124. Sportiche, Dominique. 2005. Division of labor between merge and move: Strict locality of selection and apparent reconstruction paradoxes. Ms. https://ling.auf.net/lingbuzz/000163 

  125. Steedman, Mark. 1989. Constituency and coordination in a combinatory grammar. In Mark R. Baltin & Anthony S. Kroch (eds.), Alternative conceptions of phrase structure, 201–231. Chicago: University of Chicago Press. 

  126. Steedman, Mark. 2000. The syntactic process. Cambridge, MA: MIT Press. DOI: https://doi.org/10.7551/mitpress/6591.001.0001 

  127. Sternefeld, Wolfgang. 2009. Syntax: Eine morphologisch motivierte generative Beschreibung des Deutschen [Syntax: A morphologically motivated generative description of German], vol. 2. Tübingen: Stauffenburg 3rd edn. 

  128. Sternefeld, Wolfgang. 2015. Syntax: Eine morphologisch motivierte generative Beschreibung des Deutschen [Syntax: A morphologically motivated generative description of German], vol. 1. Tübingen: Stauffenburg 4th edn. 

  129. Szabolcsi, Anna. 1983. The possessor that ran away from home. The Linguistic Review 3(1). 89–102. DOI: https://doi.org/10.1515/tlir.1983.3.1.89 

  130. Szabolcsi, Anna. 1994. The noun phrase. In Ferenc Kiefer & Katalin É. Kiss (eds.), The syntactic structure of Hungarian, 179–274. San Diego: Academic Press. DOI: https://doi.org/10.1163/9789004373174_004 

  131. Van Eynde, Frank. 1998. The immediate dominance schemata of HPSG: A deconstruction and a reconstruction. In Peter-Arno Coppen, Hans van Halteren & Lisanne Teunissen (eds.), Computational Linguistics in the Netherlands 1997: Selected papers from the 8th CLIN meeting, 119–134. Amsterdam: Rodopi. 

  132. Van Eynde, Frank. 2006. NP-internal agreement and the structure of the noun phrase. Journal of Linguistics 42(1). 139–186. DOI: https://doi.org/10.1017/S0022226705003713 

  133. Van Eynde, Frank. 2015. Predicative constructions: From the Fregean to a Montagovian treatment. Stanford, CA: CSLI Publications. 

  134. Van Eynde, Frank. 2020a. Agreement, disagreement and the NP vs. DP debate. Glossa: A Journal of General Linguistics 5(1). 65. 1–23. DOI: https://doi.org/10.5334/gjgl.1119 

  135. Van Eynde, Frank. 2020b. Nominal structures. In Stefan Müller, Anne Abeillé, Robert D. Borsley & Jean-Pierre Koenig (eds.), Head-Driven Phrase Structure Grammar: The handbook, Berlin: Language Science Press. To appear. 

  136. Van Langendonck, Willy. 1994. Determiners as heads? Cognitive Linguistics 5(3). 243–260. DOI: https://doi.org/10.1515/cogl.1994.5.3.243 

  137. van Riemsdijk, Henk. 1998. Categorial feature magnetism: The endocentricity and distribution of projections. Journal of Comparative Germanic Linguistics 2(1). 1–48. DOI: https://doi.org/10.1023/A:1009763305416 

  138. Vater, Heinz. 1991. Determinantien in der DP [Determiners in the DP]. In Susan Olsen & Gisbert Fanselow (eds.), DET, COMP und INFL: Zur Syntax funktionaler Kategorien und grammatischer Funktionen, 15–34. Berlin: de Gruyter. DOI: https://doi.org/10.1515/9783111353838.15 

  139. Vennemann, Theo & Ray Harlow. 1977. Categorial grammar and consistent basic VX serialization. Theoretical Linguistics 4(1–3). 227–254. DOI: https://doi.org/10.1515/thli.1977.4.1-3.227 

  140. Verhoeven, Elisabeth & Nico Lehmann. 2018. Self-embedding and complexity in oral registers. Glossa: A Journal of General Linguistics 3(1). 1–30. DOI: https://doi.org/10.5334/gjgl.592 

  141. Yoon, Youngeun. 1996. Total and partial predicates and the weak and strong interpretations. Natural Language Semantics 4(3). 217–236. DOI: https://doi.org/10.1007/BF00372820 

  142. Zifonun, Gisela. 2003. Dem Vater sein Hut: Der Charme des Substandards und wie wir ihm gerecht werden [The father his hat: The charm of the substandard and how we live up to it]. Deutsche Sprache 31(2). 97–126. DOI: https://doi.org/10.37307/j.1868-775X.2003.02.02 

  143. Zwicky, Arnold M. 1987. Suppressing the Zs. Journal of Linguistics 23(1). 133–148. DOI: https://doi.org/10.1017/S0022226700011063 
