1 Introduction

How much meaning can a morpheme have? The task of segmenting a whole language into the pieces that go into the compositional semantics—of finding the lexical items—can seem hopeless. Null morphemes and contextual allomorphy make it difficult to know what the parts that make up a sentence are, and the potential for ambiguity threatens to make the task of doing semantics impossible, as much for the linguist as for the learner—without some principle constraining the decomposition, for example, some limit on how much semantic content can be expressed by a single morpheme. In this paper, we propose a principle limiting how much meaning a morpheme can have. In short: it can have no more than it needs.

The goal of the paper is to give this suggestion some formal teeth, in the form of a semantic principle. We do this in a domain where both the semantics and the morphology are interesting: English comparatives and superlatives. We use our principle first to deduce a syntax, and from this a morphological analysis, and extend it to explain the facts about the typology of comparative morphology discovered by Bobaljik (2012).

Comparatives and superlatives are expressions like more death-defying, the most electric, more coffee, the most sugar. In English, as in some other languages, they have the interesting morphological property that, for a handful of adjectives, the meanings of more X and most X are expressed by the one-word forms X-er and X-est, and cannot be expressed by the two-word forms (taller/tallest, *more tall/*most tall).1

Moreover, comparatives present a striking domain for compositional semantics: apparently simple propositions are expressed only by sentences with (in many cases) almost tortuous amounts of grammatical clutter. For example, the truth conditions of the sentences in (1a) and (2a) are captured roughly by the logical representations in (1b) and (2b): both express apparently simple relational thoughts. Why, despite this apparent semantic simplicity, must so many formal parts be recruited (-er, more, than, and so on), and combined in just the right way, to express such thoughts?

(1) a. Mary is smarter than Bill is.
    b. > smart(m, b)
(2) a. Mary is more intelligent than Bill is.
    b. > intelligent(m, b)

How exactly the formal parts of (1a) and (2a) correspond to the semantic parts is yet another question. A naive approach to identifying the semantic atoms would assume a one-to-one correspondence with chunks of the string that are “easy” to identify. This would result in some strange conclusions. The relation between the meanings of tall and taller is the same as that between extreme and more extreme, yet the lack of a phonological boundary between tall and -er could make taller look like a single item. The semantic relation between good and better is again the same, even though good seems to have been replaced by bett; and the relation between bad and worse is the same again, even though the phones have changed completely. Although there are some limits to what morphology can do to distort the form–meaning correspondence, the speech stream does not overtly mark each semantic atom, and, as a result, the process of arriving at a semantic decomposition seems to need to be constrained.

The formal constraint we offer is simple and aggressive. It is called the No Containment Condition (NCC). The NCC says that no morpheme’s meaning can contain another’s (in a more precise way than this). If worse means bad plus some other bit of meaning, then it must be that it is bad semantically composed with some other morpheme. With this principle, we can take what we know about the meaning of a sentence, figure out much about the parts that compose that meaning, and from there deduce many things about the syntax and morphology of the language.

We will deduce a syntactic structure from the basic semantic facts about comparatives and superlatives. We use this syntactic structure, coupled with a morphological analysis, to explain typological generalizations about comparatives and superlatives across languages, discovered by Bobaljik (2012)—an analysis which fixes issues left open by Bobaljik’s original proposal. In order to do this, we will see that it is key for us (as, presumably, for the language acquisition device) to invoke constraints on what the basic meaningful pieces can be. Hence, the proposed NCC.

1.1 Compositionality and the Ф-domain

What exactly is the problem with figuring out the meaningful parts of a language? Why is morphology relevant to semantics? When we investigate the composition of items into meanings, we need to know what the items are that enter into the composition. Yet, although we may have a rough sense of what the meaning-bearing units are, we cannot directly identify them just from the surface pronunciation of an utterance.

Null heads give rise to one example of a non-identifiability problem. According to one theory, for example, the interpretation of a sentence like (3a) is an existential statement about an event in which someone named Gena was pushed in the past, which bears the primitive Agent relation to Cheburashka, as in (3b) (see Kratzer 2008). The existential quantification is introduced by a phonologically null Aspect head (Hacquard 2006), and the Agent relation by a null v head.

(3) a. Cheburashka pushed Gena.
    b. ∃e[Agent(e, c) & push(e, g) & Past(e)]

Yet, there is a much deeper non-identifiability problem lurking. It is one thing to say that there may be elements in the semantic composition above and beyond those that are evident from the surface speech signal. In fact, on serious reflection, very little is “evident” from the signal at all. In (3a), pushed seems to be a unit of some kind, one that we would pre-theoretically call a word. But why do we think this? There are, after all, clearly two different phonological chunks that can be found recurring elsewhere: push [pʊʃ] and -ed [t]. Where should we even start looking for the atoms of meaning?

The so-called “non-lexicalist” take on this issue is that words do not correspond to single lexical entries, nor are they units with special status in the syntax or semantics. The pre-theoretic unit “word,” in practice delimited very informally by speakers’ intuitions and by conventions about where to put spaces in text, reflects nothing more than a collection of meta-linguistic intuitions about certain phonological or syntactic domains. For example, an utterance (at least in English) will be a sequence of stress culminativity domains: prosodic units in which there must be exactly one main stress. It will also have a syntactic constituent structure. Under a non-lexicalist approach, there is nothing beyond phonological or syntactic domains like this which must necessarily correspond to a pre-theoretic word.

Furthermore, the non-lexicalist view is that phonological and syntactic domains are computed, not primitive. For example, a stress culminativity domain might be computed on the basis of what phonological material corresponds to the X0 structures in the syntax, even though each X0 may be built up out of multiple lexical items by head movement. In an alternative approach (Marvin 2002; Compton & Pittman 2010), these domains correspond to syntactic phases. Both are consistent with Distributed Morphology (DM: Halle & Marantz 1993). We adopt DM here, and we take the first option: by default, a single X0 will be encapsulated by strong phonological boundaries; these boundaries can be weakened by affixation operations, including head movement.

This is important in the case of English comparatives and superlatives because they come in two kinds: analytic, like more intelligent, most intelligent; and synthetic, like smart-er, smart-est. The crucial difference is that the analytic comparative has a stronger boundary than the synthetic comparative: it has two primary stress domains, while the synthetic has one, and, for speakers of North American English, the [t] flapping rule is blocked despite support from the segmental context (for example, mo[r#t]omatoes does not undergo flapping, unlike post-mo[rɾ]em, which lacks such a boundary). It is presumably because of this strong boundary that English orthographic conventions require a space in analytic comparatives and none in synthetic comparatives.

In spite of their phonological differences, comparatives show evidence of being semantically complex no matter what. That is, assuming that the form taller makes the same compositional contribution in (4a) and (5a), it cannot be analyzed as expressing a simple relation between two entities, as in (4b). Rather, it must involve at least two compositionally active parts—contributing tall and >—to flexibly allow for interpretations like that in (5b).

(4) a. Mary is taller than John is.
    b. > tall(m, j)
(5) a. Mary is taller than John is wide.
    b. tall(m) > wide(j)

Such patterns (among many others) suggest an analysis where comparatives are semantically composed. The resulting syntactic structure will surface with either one or two of the phonological domains that block flapping and induce primary stress—units which, to be neutral, we will call Ф-domains. In taking this kind of approach, we follow Embick & Noyer (2001) and Bobaljik (2012); in deducing the syntactic structure, we use the NCC as a constraint on what the pieces can be.

We do not pretend that our proposal should have scope over every unresolved question about the limits of semantic decomposition. In particular, we sidestep the long and storied history of questions about whether open-class items like bachelor and kill are lexically atomic (see discussion in Katz & J. A. Fodor 1963; J. D. Fodor 1970; Dowty 1979; Pustejovsky 1995; J. A. Fodor & Lepore 1998; Levin & Rappaport Hovav 2005, among others).2 Instead, we take the relatively novel tack of restricting our attention to the semantic combination of functional morphemes. Our particular interest is in the combination of functional elements that underlies expressions like most and more (see also Szabolcsi 2012).

1.2 Morphological typology

Starting from a proposed syntactic structure for comparative and superlative constructions, Bobaljik (2012) uses morphological arguments to explain two different kinds of apparent typological gaps in languages that, like English, have synthetic comparative and superlative forms.

The first states that any language which has synthetic comparatives also has synthetic superlatives. In fact, English and every other language Bobaljik studied seem to conform to a stronger generalization: there are no individual adjectives for which the superlative is synthetic but the comparative is analytic (more frood, *frood-er, but frood-est). We state this stronger version of Bobaljik’s Synthetic Superlative Generalization as in (6).

(6) Synthetic Superlative Generalization (SSG)
    If an adjective has a synthetic superlative form, then it has a synthetic comparative form.

The second typological fact is the Comparative–Superlative Generalization, (7), which concerns suppletive root allomorphy. We see ABC patterns as in Latin bon-us, ‘good,’ which has a default stem form, bon (A), a different form in the comparative, mel-ior (B), and yet a third form in the superlative, opt-imus (C). We also see ABB patterns as in Welsh mawr (A), ‘big,’ mwy (B), ‘bigger,’ mwy-af (B), ‘biggest.’ However, no adjective in any language shows a pattern like bon-us–mopt-ior–bon-imus (*ABA) or bon-us–bon-ior–ompt-imus (*AAB).

(7) Comparative–Superlative Generalization (CSG)
    An adjective root has the default form in the comparative if and only if it has the default form in the superlative.

Bobaljik attempts to explain these patterns using a hypothesis about the grammar of comparatives and superlatives, the Containment Hypothesis, (8).

(8) Containment Hypothesis
    The representation of the superlative properly contains that of the comparative.

What this means is that the parts of the syntactic structure that are relevant to comparative morphology are all there in the syntactic structure for the superlative. So, for example, if the syntactic structure for a comparative is nested within the superlative, and the syntactic structure for a comparative triggers some affixation operation whenever it is present, then it will be there to trigger that operation in a superlative too. We will see a different example of containment when we come to our proposed syntactic structure.

The intuition is clear enough: both the SSG and the CSG point to a kind of relation between the comparative and superlative forms, and in particular an asymmetric one. There are languages that have synthetic comparatives but no synthetic superlatives, like Ossetian (bærzond, ‘high,’ bærzonddær, ‘higher,’ innul bærzond, ‘highest’), but not the other way around. And even in a language like English, where it is not at all obvious that the superlative -est has anything synchronically to do with the comparative -er, the claim is nevertheless that the superlative has all the same triggers for grammatical rules as the comparative, but not vice versa.

The syntactic structure Bobaljik proposes for superlatives, (9), satisfies (8): the CMPRP containing the adjective is nested inside the SUPP.

On the basis of the NCC, we propose a different syntactic structure that also satisfies (8), first as in (10).3 The morphological analysis we propose based on this structure solves problems left open by Bobaljik’s analysis. We revise this syntax in section 4.3 to account for other facts, but the core of the analysis, that CMPR and SUP are together in a specifier rather than in a nesting relationship, remains the same.

2 Comparatives: Syntax, morphology, typology

2.1 Affixation operations and the SSG

What is a synthetic superlative form? In our terms, it is a form where the phonological reflex of the head SUP appears in the same Ф-domain as that of the root. Similarly, a synthetic comparative is one where CMPR appears in the same Ф-domain as the root. We follow Bobaljik in assuming that two heads can only appear in the same Ф-domain because of morphological operations, and that restrictions on those operations make the SSG a necessary consequence of the syntax of superlatives. For empirical reasons, we differ from Bobaljik in that we include local dislocation in our toolbox of morphological operations. This lets in a derivation that would violate the SSG under Bobaljik’s syntax.

Bobaljik considers two different affixation operations, head movement (Baker 1985; Travis 1984) and lowering (Chomsky 1957; Bobaljik 1995; Embick & Noyer 2001), which give different derivations for superlatives. If we imagine a derivation with only head movement, as in (11), we can show that there is no way to violate the SSG.

Since a synthetic superlative form is any form where the phonological reflex of the head SUP appears in the same Ф-domain as that of the root, there are two ways that violating the SSG would be hypothetically possible. One is if there were an alternate derivation that combined SUP and the adjective directly, skipping CMPR. (We use “the adjective” to refer to the complex formed by affixing the root to the category head a.) But head movement is local, and it is not possible to skip over intervening heads or traces and affix the adjective directly to SUP. This rules out any derivation other than (11) for putting the adjective and SUP in the same Ф-domain.

The other way of violating the SSG would be if a grammar generated synthetic superlatives (the adjective and SUP (or CMPR and SUP) are combined when adjective, CMPR, and SUP are all present in the syntax) but not synthetic comparatives (the adjective and CMPR are not combined when SUP is absent). That would mean that the step in (11) that combines the adjective with CMPR is triggered specifically when SUP is present in the syntax. Head movement cannot be triggered by items other than the two that it combines; it is not possible for affixation of a with CMPR in (11) to be triggered by SUP. Thus this kind of SSG violation is also ruled out when head movement is the only operation.

If we imagine a derivation with only lowering, it is the mirror image of that with only head movement. Lowering has been less extensively studied, but subjecting a derivation like (12) to certain natural restrictions would similarly give rise to the SSG.

Assume that lowering is subject to the same principles as head movement, except that it outputs a structure with the label of the lower object rather than the higher one. Then, again, the only way to put SUP and the adjective together in the same Ф-domain is the derivation in (12). The fact that the output of affixing SUP to CMPR is labeled CMPR means that the second mode of violating the SSG (as discussed for head movement) is ruled out, because SUP would only be local enough to the adjective if it lowered to CMPR, and it could only then affix to the adjective if CMPR was affixed to the adjective independently.

If the only possible affixation operations were head movement or lowering, then there would be no problem for the SSG. For empirical reasons that we will discuss in a moment, however, we propose that another operation, local dislocation, is allowed, and local dislocation would actually permit a derivation like (13). Applying head movement in the first step results in a structure labeled SUP. Applying local dislocation to the resultant structure lets in a violation of the SSG of the second type: it gives the grammar a way to target CMPR+SUP for affixation (synthetic superlative) which would not imply that CMPR alone is an affixation target (synthetic comparative).

Local dislocation is triggered under linear adjacency, combining a head with one adjacent on its immediate right or left.4 A clear example is the Latin conjunction -que in (14), which affixes itself into the phonological domain of whatever head would otherwise be linearized to its immediate right.

(14) Amemus         rumores-que  senum           aestimemus      unius    assis
     love.1PL.SBJV  rumors-and   old.men.GEN.PL  value.1PL.SBJV  one.GEN  penny.GEN
     ‘Let us love and value the rumors of the old men at one penny.’
Moreover, there is direct evidence that local dislocation is involved in synthetic comparative and superlative formation (Embick & Noyer 2001). Unlike lowering, local dislocation can be blocked by adjuncts. In English affix-hopping, T lowers across the adjunct never as though it were not there (John never eats lamb shanks; Bobaljik 1995; Embick & Noyer 2001), but lowering is blocked by the non-adjunct not (we get do-support in John does not eat lamb shanks). Yet adjuncts do block synthetic comparative and superlative formation; the facts for superlatives are shown in (15a–c). Assuming that CMPR and SUP first affix to each other to form a complex affix, (15d) illustrates the blocking effect.

(15) a. Mary is the smartest woman.
     b. *Mary is the amazingly smartest woman.
     c. Mary is the most amazingly smart woman.
     d. [ CMPR + SUP [ ADJUNCT [ ROOT → *ROOT + CMPR + SUP

The same can be demonstrated with comparatives, provided due care is taken. The comparative sentence corresponding to (15b), (16b), is bad under the interpretation ‘the degree to which Mary is amazingly smart is greater than the degree to which Abdellah is.’ Under the interpretation ‘the degree to which Mary is smarter than Abdellah is amazing,’ on the other hand, (16b) is fine. In this case, the adjunct amazingly modifies the whole degree complex, which suggests that it is structurally higher, as in (17).5

(16) a. Mary is smarter than Abdellah.
     b. *Mary is amazingly smarter than Abdellah.
     c. Mary is more amazingly smart than Abdellah.
(17) [ ADJUNCT [ CMPR [ ROOT → ROOT + CMPR

2.2 Locality of suppletion triggers

If the derivation in (13) is possible, then a problem also arises with the CSG, which concerns suppletion. The main part of DM theory that governs suppletion is the theory of vocabulary insertion. Treated as vocabulary insertion rules, the (possibly context-dependent) specification of how roots are pronounced will yield various patterns of suppletion, as in (18).

(18) a. AAA (English)
        TALL → tɔl
        tɔl, tɔlVɹ (+ CMPR), tɔlVst (+ CMPR + SUP)
        tall, taller, tallest
     b. ABB (Persian)
        GOOD → beh / — CMPR
               xub
        xub, behtær (+ CMPR), behtærin (+ CMPR + SUP)
     c. ABC (Latin)
        GOOD → opt / — CMPR + SUP
               mel / — CMPR
               bon
        bon, melior (+ CMPR), optimus (+ CMPR + SUP)
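Read procedurally, a context-dependent list like (18c) simply picks the most specific exponent whose environment matches, falling back to the elsewhere form. The following toy sketch (ours, with the environments flattened to tuples of following heads, and assuming the usual most-specific-rule-wins logic) illustrates the idea for the Latin root GOOD.

```python
# Toy rendering (ours) of the context-dependent root spell-out in (18c).
def spell_good(following_heads):
    """following_heads: the degree heads that follow GOOD in its Ф-domain."""
    if following_heads == ("CMPR", "SUP"):
        return "opt"   # opt-imus
    if following_heads == ("CMPR",):
        return "mel"   # mel-ior
    return "bon"       # bon-us (elsewhere)

print(spell_good(()), spell_good(("CMPR",)), spell_good(("CMPR", "SUP")))   # bon mel opt
```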

Root suppletion needs to take place within a single Ф-domain, and it is subject to locality restrictions: in general, only linearly adjacent heads can trigger suppletion (Adger, Bejar & Harbour 2003). The fundamental assumption of the accounts of the CSG in Bobaljik (2012) and Bobaljik & Wurmbrand (2013) is that SUP is not immediately adjacent to the root. These analyses develop mechanisms by which this head, though normally too far away from the root to trigger root suppletion, can exceptionally do so just when CMPR is itself a trigger. This makes *AAB impossible.6

However, the derivation in (13) makes it possible for SUP to be local to the root both in a linear sense (the head SUP can be linearly adjacent to the root; this is actually the case in Finnish, see Bobaljik 2012) and in a structural sense (the entire lowered affixal complex is labeled SUP). Therefore, root suppletion triggered by SUP is allowed if (13) is, and *AAB cannot be ruled out.

The possibility of (13) is also a problem for excluding the pattern *ABA. It can be excluded if the only way to affix SUP to the adjective is to bring it along with CMPR (provided that SUP cannot block the suppletion triggered by CMPR). However, (13) violates the assumption that we bring SUP along with CMPR, instead saying that we bring CMPR along with SUP.

2.3 Our proposal

We propose a different syntax, repeated in (19), which we use to develop an alternative account of the CSG and the SSG. In particular, we propose that the SSG and the CSG arise because CMPRP is a specifier, a structural configuration little studied in DM approaches to affixation.

We propose restrictions on affixation operations and on vocabulary insertion lists that result from specifiers being treated representationally differently in the morphology (section 3.3). In section 4.3, we then revise this syntax to support a semantic analysis of much. That analysis, combined with the restrictions on affixation and vocabulary insertion, makes new predictions about morphological typology. We first turn to the details of our analysis of comparatives and superlatives, starting from the semantics.

3 Applying the NCC: The case of superlatives

3.1 Semantics

Although our analysis of the typological patterns in comparative and superlative morphology differs from Bobaljik’s, it still rests on the idea that superlative constructions syntactically contain the comparative. Why should such a containment relation exist? Bobaljik suspects that his containment hypothesis is an instance of some universal constraint on the complexity of meaning that can be packaged into a single morpheme.

This conjecture can be made more precise. Suppose that it reflects a constraint on grammars, such that for any two lexical items’ interpretations m1 and m2, neither can contain the other. We define containment as in (20), where Q is the set of (universally available) composition rules, and D the set of possible interpretations of individual heads. We assume that Q contains just those rules that our best semantic theory tells us are needed to explain human semantic competence; for present purposes, it includes the rules listed in the Heim and Kratzer (1998) textbook (see Pietroski 2005 for an alternative set).

(20) Containment
     x1 is contained within x3 if there is some composition rule q ∈ Q and some x2 ∈ D such that q(x1, x2) = x3.

The condition we propose is the No Containment Condition, (21). A hypothesis space constrained by the NCC only contains a semantic representation x3 as a viable candidate for the interpretation of a lexical item m if x3 could not have been constructed out of two other semantic representations, x1 and x2, by some composition rule.

(21) No Containment Condition (NCC)
     No head’s semantic representation can contain another’s.
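To make (20) and (21) concrete, here is a minimal computational sketch (ours, purely illustrative): Q is restricted to Functional Application, denotations are modeled as Python functions over a three-entity toy domain, and equality is checked extensionally. The helper names and the toy meanings are our own; the ID/COW pair anticipates the discussion of vacuous morphemes in section 5.1.

```python
ENTITIES = ["e1", "e2", "e3"]          # toy domain of individuals

def ext_equal(f, g, test_inputs):
    """Extensional equality: f and g agree on every test input."""
    return all(f(x) == g(x) for x in test_inputs)

def violates_ncc(candidate, head_meanings, test_inputs):
    """A head meaning x3 violates the NCC (21) if some *other* head meaning x1
    is contained within it in the sense of (20): for some x2, FA applied to
    x1 and x2 reconstructs x3. Q is restricted to FA in this sketch."""
    for x1 in head_meanings:
        if x1 is candidate:                        # containment must involve another head
            continue
        for x2 in head_meanings:
            for fun, arg in ((x1, x2), (x2, x1)):  # FA applies whichever sister is the function
                try:
                    if ext_equal(fun(arg), candidate, test_inputs):
                        return True
                except TypeError:                  # fun is not a function that takes arg
                    continue
    return False

# Toy head meanings: a predicate COW and an identity type-shifter ID (section 5.1).
COW = lambda x: x in {"e1", "e2"}      # type <e,t>
ID  = lambda P: P                      # type <<e,t>,<e,t>>

print(violates_ncc(COW, [COW, ID], ENTITIES))   # True: FA(ID, COW) = COW, so ID is contained within COW
print(violates_ncc(ID,  [COW, ID], ENTITIES))   # False, given only this toy inventory
```

In a hypothesis space constrained this way, a candidate meaning is rejected as soon as it can be reassembled from other available head meanings.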

To demonstrate that the NCC can derive Bobaljik’s containment hypothesis, we set aside many questions about the finer details of the semantics of comparatives and superlatives; such debates involve quite subtle judgments about sentences of much greater complexity than those that we discuss (this is also Bobaljik’s strategy; see von Fintel 1999; Heim 2000; Bhatt & Pancheva 2004; Hackl 2009, among others, for exploration of these complexities).

Bobaljik points out that, intuitively, the interpretation of superlative sentences involves a proper superset of the interpretive components of comparative sentences: (22a) means something like ‘Mary’s height is greater than Sue’s height’, and (22b) means something like ‘Mary’s height is greater than the height of all relevant others.’

(22) a. Mary is taller than Sue is.
     b. Mary is the tallest.

Bare-bones truth-conditional representations for the sentences in (22) are given in (23), ignoring explicit reference to contexts, models, etc., and understanding the universal quantifier as ranging only over relevant entities. In (23), tall stands for the “measure function” that maps entities to their heights (Bartsch & Vennemann 1972; Kennedy 1999, among others), m stands for Mary, and s for Sue. Thus, the formulas in (23) are simply formalizations of the paraphrases given above for (22).

(23) a. ⟦(22a)⟧ = ⊤ iff tall(m) > tall(s)
     b. ⟦(22b)⟧ = ⊤ iff ∀x[x ≠ m → tall(m) > tall(x)]

What we need is a way of understanding how the semantic contribution of -est in (23b) might have been composed out of two other meanings.

Following primarily Kennedy (1999), we assume that ⟦CMPR⟧ takes three arguments: a measure function of type ⟨e,d⟩, a degree of type d, and an individual of type e, (24).7 Throughout, we abstract away from the details of the internal composition of the than-clause that typically provides d, and forgo discussion of the distinction between phrasal and clausal comparatives (though see section 5.3).

(24) ⟦CMPR⟧ = λgλdλx.g(x) > d           ⟨⟨e,d⟩, ⟨d, ⟨e,t⟩⟩⟩

One possible semantics for the superlative—one which would allow it to combine syntactically with the adjective directly and have nothing syntactically to do with CMPR—is shown in (25). This function takes two arguments: a measure function of type ⟨e,d⟩ and an individual of type e. The only type-theoretic difference between (24) and (25) is that ⟦SUP1⟧ does not take a degree argument.8

(25) ⟦SUP1⟧ = λgλx.∀y[y ≠ x → g(x) > g(y)]           ⟨⟨e,d⟩, ⟨e,t⟩⟩

An alternative analysis—one that would imply that the superlative meaning is the result of syntactically combining a head SUP2 with CMPR—is as in (26). This function takes a function of the same type as ⟦CMPR⟧ as an argument, indicated by G, and returns a function of the same type as ⟦SUP1⟧.

(26) ⟦SUP2⟧ = λGλgλx.∀y[y ≠ x → G(g)(g(y))(x)]           ⟨TYPE(⟦CMPR⟧), ⟨⟨e, d⟩, ⟨e, t⟩⟩⟩

Combining ⟦CMPR⟧ with ⟦SUP2⟧ delivers ⟦SUP1⟧. First, ⟦CMPR⟧ and ⟦SUP2⟧ combine by FA, a simplified schema for which is given in (27).

(27) Functional Application (FA)
     If α is a branching node, {β, γ} is the set of α’s daughters, and ⟦β⟧ is a function whose domain contains ⟦γ⟧, then ⟦α⟧ = ⟦β⟧(⟦γ⟧).

By this definition, given two syntactic sisters, the more highly-typed expression takes the other as its argument, provided that the type of the latter matches the input type of the former. The result of the composition is the value of the function given the argument. Since ⟦SUP2⟧ is a function that takes ⟨⟨e,d⟩, ⟨d, ⟨e,t⟩⟩⟩ as an input, the type of ⟦CMPR⟧, the result is ⟦SUP2⟧ applied to ⟦CMPR⟧. The derivation is shown explicitly in (28). Following the application of a few steps of λ-conversion, the result of the composition is as in (28f), which is identical to the interpretation of SUP1 in (25).9

(28) a. ⟦CMPR SUP2⟧ = ⟦SUP2⟧(⟦CMPR⟧)           FA
     b. = [λGλgλx.∀y[y ≠ x → G(g)(g(y))(x)]]([λgʹλdʹλxʹ.gʹ(xʹ) > dʹ])
     c. = λgλx.∀y[y ≠ x → [λgʹλdʹλxʹ.gʹ(xʹ) > dʹ](g)(g(y))(x)]
     d. = λgλx.∀y[y ≠ x → [λdʹλxʹ.g(xʹ) > dʹ](g(y))(x)]
     e. = λgλx.∀y[y ≠ x → [λxʹ.g(xʹ) > g(y)](x)]
     f. = λgλx.∀y[y ≠ x → g(x) > g(y)]
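As a sanity check on (28), the denotations in (24)–(26) can be modeled as functions over a small toy domain and compared extensionally; the following sketch (ours, with invented individuals and heights) verifies that ⟦SUP2⟧(⟦CMPR⟧) and ⟦SUP1⟧ agree on every individual.

```python
# Toy model: three individuals and their heights (the measure function "tall").
DOMAIN = ["mary", "sue", "ann"]
HEIGHT = {"mary": 180, "sue": 165, "ann": 170}
tall = lambda x: HEIGHT[x]                       # type <e,d>

CMPR = lambda g: lambda d: lambda x: g(x) > d                                      # (24)
SUP1 = lambda g: lambda x: all(g(x) > g(y) for y in DOMAIN if y != x)              # (25)
SUP2 = lambda G: lambda g: lambda x: all(G(g)(g(y))(x) for y in DOMAIN if y != x)  # (26)

derived = SUP2(CMPR)                             # the composition in (28)
assert all(derived(tall)(x) == SUP1(tall)(x) for x in DOMAIN)
print({x: derived(tall)(x) for x in DOMAIN})     # {'mary': True, 'sue': False, 'ann': False}
```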

Given the NCC and the availability of the derivation in (28), only SUP2 can coexist with CMPR. This situation with respect to containment is summarized in (29). We thus conclude that -est has the interpretation of SUP2.

(29) ⟦CMPR⟧ is contained within ⟦SUP1⟧ since: FA(⟦CMPR⟧, ⟦SUP2⟧) = ⟦SUP1⟧.

An important component of this analysis is that CMPR and SUP are syntactic sisters, as in T2 in Figure 1. Only this configuration will support the function-argument relationship we have established, needed to apply FA. This is contra Bobaljik’s proposal for their syntactic relationship, which nests a CMPRP within a SUPP, as in T1.

Figure 1: Three options for three heads.

Could our semantics be easily modified to accommodate T1? No; not if TALL-CMPR has to be able to occur both with and without SUP. This rules out any interpretation of TALL-CMPR that takes ⟦SUP⟧ as an argument. Without SUP, ⟦TALL-CMPR⟧ would minimally have to contribute a predicate of individuals, in order to relate the degree complex and the subject. Such an interpretation would render the measure function parameter of ⟦TALL-CMPR⟧ inaccessible, and there would be no obvious way for SUP to influence the value on the right hand side of > when it was present.

Ruling out T3 on the basis of semantics is not trivial. Although our ⟦SUP⟧ takes arguments in the order λ G λgλx, nothing prevents us from re-ordering these arguments to get λgλ G λx, an analysis that would still require SUP to combine with CMPR. The lack of decisive semantic evidence here reveals a general issue with our choice of semantic formalism—there simply is no general rule for enforcing the order that functions take their arguments in. We return to this point in the conclusion.

T3 is, however, implausible on morphological grounds. There are no languages in which the comparative marker transparently contains the superlative marker, and there are many in which the superlative marker transparently contains the comparative marker (Bobaljik 2012). In light of the evidence from morphology in this case, we proceed assuming that T2 is the best analysis.

Our analysis is similar to that offered by Stateva (2003), who also posits that superlatives contain comparatives. On both accounts, SUP semantically functions to plug the degree argument of ⟦CMPR⟧; such analyses correctly predict Stateva’s observation that superlatives disallow than-clauses despite this containment relationship, (30).

(30) a. *Al bought the most expensive toy than anyone else did.
     b. *Al is the tallest kid than the others in class.

It happens, then, that by applying Bobaljik’s reasoning more formally, we have arrived at the conclusion from semantics that the syntactic relationship between CMPR and SUP is a branching rather than a nesting structure.

3.2 Syntax

The semantic combination order we have established is almost enough to yield the syntax we presented earlier, repeated in (31). We have added the category head a, although we will not treat the semantics of category heads here.

We have also not said anything about labeling. In this, we take replacement tests to be definitive: CMPR can appear without SUP, but not vice versa, in the same distribution; thus CMPR and SUP together form a CMPRP. An aP can appear without a CMPRP, but not vice versa, in the same distribution; thus a and not CMPR forms the label. And since aP is already complex, CMPRP is a specifier.

3.3 Morphology

Now we can give an analysis of the analytic–synthetic alternation in English. The details will be revised after the discussion of much in section 4.3, where we present a new syntax, but we present this basic version so that we can relate our syntax to the morphological typology presented by Bobaljik.

Summarizing our first proposal: for CMPR and SUP to form a single Ф-domain, head movement or lowering applies obligatorily to combine them. The category head is affixed to the root in a similar way. Local dislocation, targeting CMPR and a, then combines the two Ф-domains into synthetic forms, for certain adjectives. This operation is triggered by a lexical marking feature [+SC] on those adjectives that percolates from the root to a.10 We now review the details.

To motivate some of the technical details, we will preview what we are going to say about the SSG: we suggest that CMPR and SUP originating in a specifier position is crucial. In particular, we claim that local dislocation is restricted with regard to what it can do with specifiers: the morphology is prevented, or almost completely prevented, from making reference to the internal parts of specifiers.

The transfer to morphology yields sequences of heads rather than constituents. Such sequences can correspond to a specifier by being the sequence of heads that is the yield of that specifier. Head movement and lowering label the complex X0 structures that they output; a complex Ф-domain with a label can be represented as a label × sequence pair (LS-pair), (32).11

(32) <label, sequence of heads in the Ф-domain>

We assume that local dislocation can only target complex Ф-domains by their labels. With this in mind, our analysis is that the derivation stops at (33a) if there is no [+SC] feature, yielding an analytic form, and proceeds to (33b) if there is, yielding a synthetic form (Ф-domain boundaries are marked with ≪ and linear adjacency with ͡ ).

(33) a. ≪ <CMPR, CMPR ͡ SUP> ͡ ≪ <a[+sc], ROOT ͡ a[+sc]>           (LD)
     b. ≪ <a[+sc], ROOT ͡ a[+sc]> ͡ <CMPR, CMPR ͡ SUP>
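The following toy sketch (ours; the data structures and the rule are invented for illustration) shows the LS-pair representation in (32) and the step from (33a) to (33b), assuming, as discussed below, that labels are preserved under local dislocation.

```python
from collections import namedtuple

# An LS-pair (cf. (32)): a morphological label plus the sequence of heads in a
# complex Ф-domain. A Ф-domain is modeled as a list of LS-pairs, and a stretch
# of the utterance as a list of Ф-domains.
LS = namedtuple("LS", ["label", "heads"])

def local_dislocation(domains):
    """Sketch of (33): when a Ф-domain labeled CMPR is immediately followed by
    a Ф-domain whose a-label bears [+SC], invert the two and merge them into a
    single Ф-domain, preserving both labels."""
    out, i = [], 0
    while i < len(domains):
        cur = domains[i]
        nxt = domains[i + 1] if i + 1 < len(domains) else None
        if nxt and cur[0].label == "CMPR" and nxt[0].label == "a[+sc]":
            out.append(nxt + cur)      # synthetic form: adjective first, degree complex second
            i += 2
        else:
            out.append(cur)            # no [+SC]: the derivation stops here (analytic form)
            i += 1
    return out

analytic  = [[LS("CMPR", ("CMPR", "SUP"))], [LS("a",      ("INTELLIGENT", "a"))]]
synthetic = [[LS("CMPR", ("CMPR", "SUP"))], [LS("a[+sc]", ("SMART", "a"))]]

print(local_dislocation(analytic))     # unchanged: two Ф-domains (most intelligent)
print(local_dislocation(synthetic))    # one Ф-domain: <a[+sc], SMART-a> then <CMPR, CMPR-SUP> (smartest)
```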

In an English analytic comparative, the degree morphology is realized as [mɔɹ] or [most]. In synthetic forms, the degree morphology is realized as a suffix containing a vowel subject to reduction, either [V̆ɹ] or [V̆st].12 Vocabulary insertion rules that give the correct surface forms are given in (34) (analytic more/most, comparative/superlative suffixes, and root suppletion in good, better, best, worse, and worst).

(34) Vocabulary insertion rules (version 1)
     CMPR → ø    / <a, GOOD> ͡ — ͡ SUP
     CMPR → s    / <a, BAD> ͡ —
     CMPR → V̆s   / a ͡ — ͡ SUP
     CMPR → V̆ɹ   / a ͡ —
     CMPR → mos  / — ͡ SUP
     CMPR → mɔɹ
     SUP → t
     GOOD → bɛs  / — ͡ <CMPR, SUP>
     GOOD → bɛt  / — ͡ CMPR
     GOOD → gʊd
     BAD → wʌr   / — ͡ CMPR
     BAD → bӕd

To make these rules work, and give the correct surface forms, we make the following assumptions. First, we assume that the environment of a vocabulary insertion rule is limited to material within a single Ф-domain, and that labels are preserved following local dislocation, including when local dislocation combines two complex Ф-domains that each have their own labels, as in (33).

Second, the context made visible to vocabulary insertion for a particular head is one item adjacent on its left and on its right. Each item may either be an LS-pair or a simple head. Context restrictions in VI rules can refer to heads or be pairs of the form <l, r>, with r consisting of exactly one head. A head l in the context restriction of a VI rule will match against an instance of l in the context or against a pair labeled l. A pair <l, r> will match against an LS-pair labeled l whose sequence starts with r (if the context restriction is on the right), or ends with r (if it is on the left).

Finally, null heads are pruned from the context representation for vocabulary insertion (Embick 2010). More precisely: when vocabulary insertion assigns a head a null realization, subsequent heads undergoing vocabulary insertion will not see that head in their context, either as a simple head or as a member of a sequence in an LS-pair. Crucially, however, a null realization of a head l does not remove l from LS-pairs <l, s>.13 Within this framework, the rules in (34) derive the correct surface forms, as the reader can verify.
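To illustrate the matching conventions just stated, here is a small sketch (ours) of the context-matching step, applied to the configuration that yields best. It assumes, per the pruning convention above, that a and CMPR have already received null exponents, so that what GOOD sees to its right is the LS-pair <CMPR, (SUP)>; the function names and the most-specific-rule-wins step are our own simplifications.

```python
# Context items are either simple heads (strings) or LS-pairs (label, sequence).
# Context restrictions are either heads or pairs (label, head); a pair matches
# an LS-pair with that label whose sequence starts (right context) or ends
# (left context) with that head, as described in the text.

def matches(restriction, item, side):
    if isinstance(restriction, str):               # bare head/label restriction
        if isinstance(item, str):
            return restriction == item             # matches a simple head...
        label, _ = item
        return restriction == label                # ...or an LS-pair with that label
    r_label, r_head = restriction                  # pair restriction <l, r>
    if isinstance(item, str):
        return False
    label, seq = item
    if label != r_label or not seq:
        return False
    return seq[0] == r_head if side == "right" else seq[-1] == r_head

# Deriving "best": with a and CMPR spelled out as ø and pruned (their labels
# remain), GOOD's right-hand context is the LS-pair <CMPR, (SUP)>.
right_of_good = ("CMPR", ("SUP",))
print(matches(("CMPR", "SUP"), right_of_good, "right"))   # True: the bɛs rule matches
print(matches("CMPR", right_of_good, "right"))            # True: the bɛt rule also matches,
                                                           # but the more specific bɛs rule wins
```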

3.4 Typology

By giving syntactic specifiers a special status in the morphology, we derive the SSG (SUP cannot undergo local dislocation on its own or trigger local dislocation of a complex affix corresponding to CMPR + SUP) and the CSG (SUP cannot be a trigger for allomorphy unless CMPR is also a trigger).

Access to the internal parts of specifiers is restricted by imposing the principle in (35). This principle ensures that a complex Ф-domain corresponding to a specifier will have the syntactic head of the whole constituent as a morphological label, regardless of whether it was formed by head movement or lowering. So, SUP cannot be targeted for local dislocation when it has affixed with CMPR in the specifier. In any language in which SUP and CMPR combine to form a complex Ф-domain, they will only ever be able to combine with the adjective by a rule that combines CMPR with the adjective independently.

(35) A single Ф-domain that contains exactly the yield of a specifier in the syntax is labelled in the morphology with the syntactic label of that specifier.

What if a language does not combine SUP and CMPR into one Ф-domain? We need to block the possibility that SUP is targeted by local dislocation in isolation, extracting out of the specifier to affix with a linearly adjacent adjective (violating the SSG). The principle in (36) takes care of this issue. If a language does not combine SUP and CMPR into one Ф-domain, (36) prevents local dislocation from specifically extracting SUP or CMPR from the specifier. This rules out an SSG-violating derivation in which local dislocation targets SUP’s Ф-domain alone,14 and it gives a derivation for languages like Ossetian (see section 1.2) where the comparative and the superlative are independent.

(36) If a Ф-domain is properly contained within the yield of a specifier in the syntax, local dislocation cannot target it by a morphological label.

As for the CSG, we impose principle (37). Principle (37) says that context restrictions on vocabulary insertion rules cannot specify pairs except as a special case. The ban is lifted in the vocabulary insertion list for GOOD in (34), where there is a rule (for bett-) sensitive to CMPR. That licenses the rule for bes-, sensitive to <CMPR, SUP>.15

(37) A vocabulary insertion list containing a rule sensitive to a pair <l, r> must also contain a rule with only l in its environment.

These principles are a particular way of saying that specifiers are special in the morphology, and that complex morphological objects more generally are special for vocabulary insertion. Naturally, they make them special in exactly the way we need them to be in order to yield the attested typology. Presumably, further research could falsify them, or could reduce them to something deeper.

4 Applying the NCC: the case of much

4.1 Semantics

We now revise our analysis beyond the basic version presented above. Within the domain of comparatives, applying the logic of the NCC leads to more decomposition within superlative (and comparative) forms. In fact, it leads to just the sort of decomposition proposed by Bresnan (1973), in which comparatives and superlatives uniformly contain instances of a morpheme MUCH.

Bresnan’s morphosyntactic analysis of data like that in (38) and (39) decomposed the form more into two morphemes, on a par with the analysis of expressions like as much, so much, and too much. We will conclude that the NCC points to the same decomposition: more hides the presence of two pieces, CMPR and MUCH.

(38) a. Mary bought more coffee than John did.
     b. Mary bought as much coffee as John did.
     c. Mary bought so much coffee.
     d. Mary bought too much coffee.
(39) a. Mary ran more than John did.
     b. Mary ran as much as John did.
     c. Mary ran so much.
     d. Mary ran too much.

In nominal and verbal degree constructions, much is generally taken to play an important semantic role (see Heim 1985; Bhatt & Pancheva 2004; Hackl 2009, among others). As pointed out by Cresswell (1976), in some cases its presence or absence can make the difference between a demonstration of an entity (40)a and a degree (40)b.

(40) a. John buys this coffee.
     b. John buys this much coffee.

What of its semantics? The literature holds that MUCH introduces measure functions—that is, dimensions for measurement—for nominal and verbal predicates.16 It has a signature property: which measure function it introduces in a given case is determined in part by the predicate, and in part by the context. We discuss this property in some detail so that we can show later that it is also found in adjectival comparatives.

In (41), we see examples where the dimensions for measurement differ along with different predicates: for instance, emotional intensity in (41)a, energy in (41)b, or informativity in (41)c. (These data are based on Schwarzschild 2006.)

(41) a. Mary has as much love for John as for Bill.
     b. There is too much heat in this room.
     c. Don’t give me so much information.

Yet, more than one dimension is also possible even with the same predicates. This is what allows two otherwise contradictory-seeming equatives to be simultaneously true when the intended dimensions for measurement differ, (42). (These data are based on Cartwright 1975.)

(42) a. We have as much water as sand (by volume).
     b. We don’t have as much water as sand (by weight).

Wellwood (2015) formalizes ⟦MUCH⟧ using a variable μ over measure function-types, whose value is fixed by the assignment function A.17,18 Which measure functions are permissible values of μ depends on what sort of thing α is (an entity, an eventuality, etc). In (43), A(μ) is typed for functions of type ⟨η,d⟩, where η indicates neutrality with respect to the types e (entities) and v (eventualities).

(43) ⟦MUCH⟧A = λα.A(μ)(α)           ⟨η, d⟩

In the context of cross-categorial comparatives, the interpretation of the equative head is as in (44). It differs from the interpretation we have so far assumed for CMPR only in having ≥ rather than > (see Schwarzschild 2008 for discussion of ≥ rather than = here).

(44) ⟦AS⟧A = λgλdλα.g(α) ≥ d           ⟨⟨η, d⟩, ⟨d, ⟨η, t⟩⟩⟩

Comparatives with more show interpretive properties parallel to equatives with as much: they give rise to interpretations in terms of different measures across predicates, (45), as well as within predicates, (46).

(45) a. Mary has more love for John than for Bill.
     b. We need more heat in this room.
     c. He doesn’t want more information.
(46) a. There is more water than sand (by volume).
     b. There is more sand than water (by weight).

By the NCC, this means that more hides the structure of MUCH, in addition to CMPR. The alternative, in which a distinct comparative head incorporates the same semantics as MUCH, is not possible.

Explicitly, the interpretations of the relevant possible CMPR heads are given as in (47). ⟦CMPR1A lexically encodes a contextually-determined measure function, whereas ⟦CMPR2A is merely the ⟦CMPRA we assumed previously for adjectival comparatives, appropriately generalized.

(47) a. ⟦CMPR1⟧A = λdλα.A(μ)(α) > d           ⟨d, ⟨η, t⟩⟩
     b. ⟦CMPR2⟧A = λgλdλα.g(α) > d           ⟨⟨η, d⟩, ⟨d, ⟨η, t⟩⟩⟩

The result of composing ⟦MUCHA with ⟦CMPR2A delivers, by FA, the same interpretation as ⟦CMPR1A, (48). In light of this derivation, ⟦CMPR1A contains ⟦MUCHA, (49). Thus we deduce by the NCC that MUCH is present in nominal and verbal comparatives.

(48) a. ⟦MUCH CMPR2⟧A = ⟦CMPR2⟧A(⟦MUCH⟧A)           FA
     b. = [λgλdλx.g(x) > d]([λxʹ.A(μ)(xʹ)])
     c. = λdλx.[λxʹ.A(μ)(xʹ)](x) > d
     d. = λdλx.A(μ)(x) > d
(49) ⟦MUCH⟧A is contained within ⟦CMPR1⟧A since: FA(⟦CMPR2⟧A, ⟦MUCH⟧A) = ⟦CMPR1⟧A.

Previously, we assumed that adjectives lexically introduce their own measure functions. On Wellwood’s (2012; 2015) account, adjectives express predicates of states (50), which can be measured by ⟦MUCH⟧ just as bits of coffee (51a) or portions of running events (51b) can be.19,20

(50) ⟦TALL⟧A = λs.tall(s)           ⟨v,t⟩
(51) a. ⟦COFFEE⟧A = λx.coffee(x)           ⟨e,t⟩
     b. ⟦RUN⟧A = λe.run(e)           ⟨v,t⟩

The idea that MUCH is present in nominal and verbal comparatives is not particularly controversial from the perspective of semantics. The idea that MUCH is present in adjectival comparatives is more controversial. We present four pieces of evidence suggesting that this is nevertheless the case.

Our first piece of evidence is that the same kind of semantic variability is detectable here, in terms of which dimensions for measurement are possible. The following examples show variability across the predicates red, expensive, and tall, as well as within these predicates.

Adjectival comparatives with red can be interpreted as involving different dimensions.21 Intuitively, there can be two patches of red lipstick, such that it is possible to say that one patch is redder than another by brightness, (52)a, while the opposite relation obtains by saturation, (52)b.

(52) a. This lipstick is redder than that lipstick (by brightness).
     b. That lipstick is redder than this lipstick (by saturation).

To see the pattern with expensive, imagine you are comparing prices on Amazon US and Amazon France. On Amazon US, a one week supply of Soylent costs $193.68, and a pair of Camper Men’s 18304 Pelotas XL Sneaker (size 41) costs $195.90. On Amazon France, the same amount of Soylent costs €370.49, and the Pelotas cost €139.00. In this context, both (53)a and (53)b can be true.

(53) a. The Pelotas are more expensive than Soylent (on Amazon US).
     b. Soylent is more expensive than the Pelotas (on Amazon France).

Finally, to see the pattern with tall, consider the case of Mount Everest and Mauna Kea, a dormant volcano in Hawaii. Typically, Mount Everest is thought to be the tallest mountain in the world, at around 29,000 feet. Yet, such a measure only considers the extent of the mountain above sea level; in terms of absolute extent, Mauna Kea is taller, at around 33,000 feet. This state of affairs can be truthfully summarized as in (54).

(54) a. Mount Everest is taller than Mauna Kea (in extent above sea level).
     b. Mauna Kea is taller than Mount Everest (in absolute extent).

Our second piece of evidence is Bresnan’s (1973) observation of cases in which much surfaces overtly with adjectives, for example (55). If MUCH were barred from adjectival comparatives categorically, (55b) should be ungrammatical; yet, it is perfectly acceptable, and semantically indistinguishable from (55a). On the present account, both sentences would contain MUCH underlyingly.

(55) a. The plants may grow as high as 6 feet.
     b. The plants may grow as much as 6 feet high.

Our third piece of evidence comes from Corver (1997), who, arguing for an analysis only slightly different from Bresnan’s, provides data that illustrate the same semantic point. In (56)a, too appears to combine with tall directly. Yet, when the pro-form so resumes the semantics of the adjective in (56)b, much surfaces, and the result is semantically indistinguishable from (56)a.

(56) a. Mary is tall, in fact she is too tall.
     b. Mary is tall, in fact she is too much so.

Our fourth and final piece of evidence concerns data from Greek. In this language, the equivalent of much that surfaces in nominal comparatives (57a) can optionally surface in adjectival comparatives (57b). (These data were provided by A. Giannakidou, p.c.)

(57) a. I    Maria  ipje       pio  poly  krasi  apoti          o    Janis
        the  Maria  drank.3SG  -er  much  wine   than.clausal   the  John
        ‘Mary drank more wine than John did.’
     b. To   fagito  tis      Marias     itan  pio  (poly)  nostimo    apoti         tou      Jani.
        the  food    the.GEN  Mary.GEN   was   -er  (much)  delicious  than.clausal  the.GEN  John
        ‘Mary’s food was more delicious than John’s was.’

Finally, there is a reason internal to our theory to posit that the form much corresponds to MUCH (and means what it does) in (55)b, (56)b, and (57b). The alternative, which would allow for adjectives to continue to be interpreted as lexically introducing measure functions, would require much to be semantically vacuous in cases where it appears with adjectives. However, as we discuss in section 5.1, the NCC implies that there simply are no semantically vacuous heads.

We thus posit that MUCH is a regular feature of comparative constructions, and so is nested inside superlatives as well. Combined with the previous results, the possibilities for constituency are as in Figure 2.

Figure 2: Three options for four heads.

M1 is excluded for semantic reasons: CMPR needs access to the measure functions introduced by MUCH. The analysis that we have given is directly compatible with M2, since ⟦CMPR SUPA takes ⟦MUCHA as an argument (and this complex combines with an adjective, noun, or verb by Predicate Modification22). Semantically, this leaves open the possibility of assigning different types to support M3.

We do not explore this possibility here. There are two ways it could be made to work: either ⟦MUCH TALLA takes ⟦CMPR SUPA as an argument, or the other way around. The consequences of either approach would require bigger changes to the semantics, and be less consonant with previous literature, than is presently justifiable. Thus, we proceed assuming the constituency in M2.

A potential prediction of any account that posits MUCH uniformly in degree constructions, or indeed of any account on which measure functions are introduced separately from adjectives, is that there could be languages with no degree constructions at all. If such a language lacked a morpheme like MUCH, which introduces the mapping to degrees, it would lack adjectival as well as nominal and verbal comparatives. This could be true of Washo (Bochnak 2013).

4.2 Syntax

Starting with M2, the same kinds of distributional facts as before lead us to posit the syntactic labels in (58). Specifically, MUCH is always present in degree constructions, but CMPR and SUP are not; conversely, CMPR (and therefore SUP) cannot appear without MUCH. Thus MUCH forms the label for the new, more complex structure, rather than CMPR; as before, it is a specifier of a, for the same reason.

This syntax puts MUCH in a position where it could not, by itself, affix to a or the root, given the restrictions on head movement/lowering and the restrictions on local dislocation in specifiers proposed above. That has the consequence that the triggering “context” for the much/null MUCH alternation could not be adjacency to a, as that would require that they be in the same Ф-domain.

We propose instead that it is the result of Agree or selection between MUCH and the categorial head; the two resulting flavors of MUCH are notated as MUCH[+a] and MUCH[–a]. The absence of overt much with adjectives is therefore superficial, and does not afford any deep semantic explanation. We believe this comports with the facts from Greek discussed in the previous section. It is also consistent with the appearance of much in adjectival comparatives in other syntactic configurations (as much as, much so). In these cases, there is simply not an a head in the syntax to license MUCH[+a].

4.3 Morphology and typology

The presence of MUCH as a part of comparatives and superlatives leads us to revise our earlier morphological analysis somewhat. With respect to the analytic forms, more and most must now be combinations of CMPR or of the complex CMPR+SUP affix with MUCH, all in a single Ф-domain. To construct this single Ф-domain, MUCH affixes with CMPR, or with CMPR+SUP, either by head movement or by lowering.

The local dislocation rule we proposed before was triggered by CMPR. Now, given our syntax and the principle making the contents of specifiers invisible for that operation (beyond the label), this can no longer be stated. Instead, we now propose that it is the whole MUCH complex that moves, targeted by a local dislocation rule that combines MUCH with a, as in (59).

(59) a. ≪ <MUCH, MUCH ͡ CMPR ͡ SUP> ͡ ≪ <a[+sc], ROOT ͡ a[+sc]>           (LD)
     b. ≪ <a[+sc], ROOT ͡ a[+sc]> ͡ <MUCH, MUCH ͡ CMPR ͡ SUP>

We propose the vocabulary insertion rules in (60). These capture the difference between adjectival and non-adjectival MUCH: as much wood, as much woodiness, but as woody.

(60) Vocabulary insertion rules (revised)
     MUCH[–a] → mʌtʃ  / ≪ — ≪
     MUCH → m         / ≪ — ͡ CMPR
     MUCH → ø
     CMPR → ø         / <a, GOOD> ͡ — ͡ SUP
     CMPR → V̆s        / a ͡ — ͡ SUP
     CMPR → V̆ɹ        / a ͡ —
     CMPR → os        / — ͡ SUP
     CMPR → ɔɹ
     SUP → t
     GOOD → bɛs       / — ͡ <MUCH, SUP>
     GOOD → bɛt       / — ͡ MUCH
     GOOD → gʊd
     BAD → wʌr        / — ͡ MUCH
     BAD → bӕd
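The analytic forms can be read directly off the revised rules; the following toy spell-out (ours, with the contexts simplified to just what these three cases need) derives more, most, and standalone much.

```python
# Toy spell-out (ours) of the analytic degree Ф-domain under (60): MUCH -> m at
# the left edge when CMPR follows, MUCH[-a] -> mʌtʃ when it is a Ф-domain by
# itself, CMPR -> os before SUP and ɔɹ otherwise, SUP -> t.
def spell_out(heads):
    out = []
    for i, h in enumerate(heads):
        nxt = heads[i + 1] if i + 1 < len(heads) else None
        if h == "MUCH[-a]" and len(heads) == 1:
            out.append("mʌtʃ")
        elif h == "MUCH" and i == 0 and nxt == "CMPR":
            out.append("m")
        elif h == "CMPR":
            out.append("os" if nxt == "SUP" else "ɔɹ")
        elif h == "SUP":
            out.append("t")
    return "".join(out)

print(spell_out(["MUCH", "CMPR"]))          # mɔɹ  (more)
print(spell_out(["MUCH", "CMPR", "SUP"]))   # most (most)
print(spell_out(["MUCH[-a]"]))              # mʌtʃ (much)
```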

This strengthens the CSG. The more general CSG predicted under our theory is as in (61). For a given root, our vocabulary insertion principle dictates that there must be one suppletive form that is triggered just by the presence of MUCH. This form will be the same across all the synthetic degree flavors.

(61) Comparative–Superlative Generalization (generalized)
     An adjective root cannot have suppletion in only one synthetic degree construction.

Welsh has, in addition to comparative and superlative synthetic forms, a synthetic equative form (the realization of AS, we assume): for example, brau, “fragile,” breu-ach, “more fragile,” breu-af, “most fragile,” breu-ed, “as fragile.” The generalized CSG predicts an ABBB pattern, borne out in bach, “small,” llai, “smaller,” llei-af, “smallest,” llei-ed, “as small.”23 Other adjectives show different suppletive forms in different degree constructions, but, as far as we can see, none show suppletion in only one while the others are transparent.

As for the SSG, the new analysis implies that any affixal complex undergoing local dislocation will be targeted by the label MUCH, not CMPR. This has nothing to say about the typology of other degree items in the position of CMPR; these can freely undergo or fail to undergo affixation with MUCH, thereby allowing or blocking a synthetic form. It does predict that, in English and in any language with synthetic comparatives, a synthetic form (adjective + MUCH) should also be available if and when MUCH appears on its own. According to the semantic analysis of MUCH that we have assumed, however, it is not possible for MUCH to appear without a degree operator.

5 Consequences & extensions

The NCC has consequences beyond the analysis of analytic and synthetic comparatives and superlatives. We briefly consider some of these before concluding.

5.1 Vacuous morphemes

The NCC predicts that there can be no vacuous morphemes.

Consider a trivial example involving the head we call ID in (62a), which expresses the identity function on predicates. When ID is applied to an arbitrary predicate like ⟦COW⟧ in (62b), the interpretation of the result is identical to that of ⟦COW⟧ itself, (62c). If a head like ID were in the space of possible denotations, it would be contained within the meaning of every predicate. By the NCC, either ID is not in the space of possible denotations, or COW does not express a property shared by all and only the cows. Obviously, the conclusion is that ID is impossible.

    (62)  a.  ⟦ID⟧ = λP.P           ⟨⟨e,t⟩, ⟨e,t⟩⟩
          b.  ⟦COW⟧ = λx.cow(x)           ⟨e,t⟩
          c.  ⟦ID COW⟧ = λx.cow(x)           FA(⟦ID⟧, ⟦COW⟧) = ⟦COW⟧
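
The point can be seen extensionally in a few lines of code; in this minimal sketch, predicates are modeled as functions from a toy domain to truth values, and the domain and names are purely illustrative assumptions.

    DOMAIN = ["bessie", "fido", "tom"]

    COW = lambda x: x == "bessie"     # ⟦COW⟧ = λx.cow(x), modeled extensionally
    ID = lambda P: P                  # ⟦ID⟧ = λP.P

    ID_COW = ID(COW)                  # FA(⟦ID⟧, ⟦COW⟧)

    # Composing with ID changes nothing: the two functions agree on every individual.
    assert all(ID_COW(x) == COW(x) for x in DOMAIN)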

Areas where this conclusion is particularly relevant are the analysis of agreement and negative concord phenomena. Two standard views are that such elements are either ignored by the semantics (Chomsky 1995; Haegeman & Lohndal 2010), or not present at all until PF (Bobaljik 2008). We thus see no reason to posit the existence of elements that are interpreted by the semantic component, but which are nonetheless semantically vacuous.

5.2 Conjunction

An anonymous reviewer points to an interesting set of cases where the typological predictions of the NCC might be fruitfully exhibited: the type polymorphism of Boolean coordinators like and (Partee & Rooth 1983).

Consider the standard compositional interpretation for and in (63a), on which it conjoins two propositions of type t. A variant interpretation for and that can be used to conjoin two predicates of type ⟨e,t⟩ is given in (63b). As should be clear, (63b) can be derived from (63a) by means of the type-shifter UPAND in (63c). (Note that these representations involve a different semantic type for verbs than we have assumed in this paper.)

    (63)  a.  ⟦AND1⟧ = λpλq.p ∧ q           ⟨t, ⟨t, t⟩⟩
          b.  ⟦AND2⟧ = λPλQλx.P(x) ∧ Q(x)           ⟨⟨e,t⟩, ⟨⟨e,t⟩, ⟨e,t⟩⟩⟩
          c.  ⟦UPAND⟧ = λRλPλQλx.R(P(x))(Q(x))           ⟨TYPE(⟦AND1⟧), ⟨⟨e,t⟩, ⟨⟨e,t⟩, ⟨e,t⟩⟩⟩⟩
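
A short sketch can verify the derivation extensionally over a toy domain; truth values are modeled as booleans, and the individuals and predicate names are illustrative assumptions, not part of the original.

    DOMAIN = ["john", "mary"]

    AND1 = lambda p: lambda q: p and q                              # (63a)
    AND2 = lambda P: lambda Q: lambda x: P(x) and Q(x)              # (63b)
    UPAND = lambda R: lambda P: lambda Q: lambda x: R(P(x))(Q(x))   # (63c)

    WALK = lambda x: x == "john"
    TALK = lambda x: x in ("john", "mary")

    # Shifting AND1 with UPAND yields a function with the same extension as AND2.
    shifted = UPAND(AND1)
    assert all(shifted(WALK)(TALK)(x) == AND2(WALK)(TALK)(x) for x in DOMAIN)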

The variant AND1 can be used to handle cases of sentential coordination, (64a), and AND2 to handle verbal coordination, (64b), so that (64b) needn’t be analyzed as a reduced form of (64a). The interpretation derived for both of these sentences would be as in (64c).

    (64)  a.  John walks and John talks.
          b.  John walks and talks.
          c.  talk(j) ∧ walk(j)

The NCC predicts that grammars do not allow AND2 and UPAND to coexist in the lexicon, or AND2 and AND1. If we make the simplifying assumption that this type shifter is always present, we predict that a language could never have the AND2 meaning without the AND1 meaning. The typological literature here is inconclusive: it shows that languages may have different morphophonological realizations of coordination across levels of syntactic structure (sentential, verbal, and so on), but does not indicate whether the existence of the sentential coordinator implies the other types (see Haspelmath 2007 and references therein, and also WALS Feature 64A).

5.3 2-place versus 3-place comparative heads

The same reviewer points out that the NCC could play a role in the debate currently being waged over the status of 2-place versus 3-place CMPR.24 The main debate concerns the syntax-semantics of examples like (65), in particular whether the semantic type of ⟦CMPR⟧ is the same in both the “clausal comparative” in (65a) and the “phrasal comparative” in (65b), as well as whether these types are the same for surface-equivalents in other languages.

    (65)  a.  Mary is taller than John is.
          b.  Mary is taller than John.

Bhatt & Takahashi (2011), building on Kennedy 1999 (see also relevant discussion and references in Lechner 2001; Merchant 2009; Kennedy 2007; Alrenga, Kennedy & Merchant 2012), compared English and Hindi-Urdu comparatives like (65). They determined that English phrasal and clausal comparatives, and Hindi-Urdu clausal comparatives, involve the interpretation in (66a), but Hindi-Urdu additionally makes use of (66b) for its phrasal comparatives.

    (66)  a.  ⟦CMPR2⟧ = λDλDʹ.∃d[Dʹ(d) & ¬D(d)]           ⟨⟨d,t⟩, ⟨⟨d,t⟩, t⟩⟩
          b.  ⟦CMPR3⟧ = λxλgλy.∃d[g(y,d) & ¬g(x,d)]           ⟨e, ⟨⟨d,⟨e,t⟩⟩, ⟨e,t⟩⟩⟩

An alternative, and truth-conditionally equivalent, way of formulating the semantics of CMPR3 is as in (67a). In light of this formulation, and as Bhatt & Takahashi and others note, it is possible to derive the interpretation of CMPR3 from CMPR2 straightforwardly via a type-shift like UPCMPR in (67b). Thus, ⟦CMPR2⟧ and ⟦CMPR3⟧ stand in a containment relationship.

    (67)  a.  ⟦CMPR3ALT⟧ = λxλgλy.⟦CMPR2⟧({d | g(x,d)})({d | g(y,d)})           ⟨e, ⟨⟨d, ⟨e,t⟩⟩, ⟨e,t⟩⟩⟩
          b.  ⟦UPCMPR⟧ = λMλxλgλy.M({d | g(x,d)})({d | g(y,d)})           ⟨TYPE(⟦CMPR2⟧), TYPE(⟦CMPR3⟧)⟩
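
The same kind of toy check works here. In this sketch, degree predicates are modeled as sets of degrees, and the gradable relation g and the "heights" are illustrative assumptions; the aim is only to show that shifting CMPR2 with UPCMPR reproduces the extension of CMPR3.

    DEGREES = range(1, 6)
    HEIGHTS = {"mary": 5, "john": 3}                # toy measures for the sketch

    g = lambda y, d: HEIGHTS[y] >= d                # g(y,d): y's height reaches d

    CMPR2 = lambda D: lambda D2: any(d in D2 and d not in D for d in DEGREES)              # (66a)
    CMPR3 = lambda x: lambda g: lambda y: any(g(y, d) and not g(x, d) for d in DEGREES)    # (66b)
    UPCMPR = (lambda M: lambda x: lambda g: lambda y:
              M({d for d in DEGREES if g(x, d)})({d for d in DEGREES if g(y, d)}))         # (67b)

    # The shifted CMPR2 agrees with CMPR3 on every pair of individuals.
    shifted = UPCMPR(CMPR2)
    for x in HEIGHTS:
        for y in HEIGHTS:
            assert shifted(x)(g)(y) == CMPR3(x)(g)(y)

    print(CMPR3("john")(g)("mary"))   # True: Mary is taller than John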

As with the previous case of conjunction, the NCC thus predicts that no language can have both CMPR3 and UPCMPR, or CMPR2 and CMPR3. That is, a language either handles (65a) and (65b) uniformly, or it analyzes the phrasal comparative using a shifted version of the interpretation in (66a). In other words, again making the simplifying assumption that the type shifter is always available, a language couldn't display the CMPR3 meaning without displaying the CMPR2 meaning. If Hindi-Urdu has both, and if English has only ⟦CMPR2⟧, then these are two examples that are at least consistent with this prediction.

5.4 Negation

E. Chemla (p.c.) points out that negative quantifiers, antonyms, and comparatives with less are problematic from the perspective of the NCC as we have presented it. (An anonymous reviewer points out that the character of this problem likely extends much further as well.)

To see the issue, consider possible interpretations of the quantificational determiners NO and SOME. Suppose that ⟦NO⟧ is represented as in (68). How is SOME interpreted? Truth-conditionally, it could equally well be represented as in (69a) or (69b). Importantly, the direction of containment between no and some depends on which of these forms is “correct.”

    (68)      ⟦NO⟧ = λPλQ.¬∃x[P(x) & Q(x)]
    (69)  a.  ⟦SOME⟧ = λPλQ.∃x[P(x) & Q(x)]
          b.  ⟦SOME⟧ = λPλQ.¬¬∃x[P(x) & Q(x)]
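
The worry can be made concrete with a brute-force check over a toy model: the two candidate representations in (69) have exactly the same extension, and the second is literally the negation of ⟦NO⟧, so truth conditions alone cannot decide which way containment runs. The finite domain below is an illustrative assumption.

    from itertools import combinations

    DOMAIN = ["a", "b", "c"]
    SUBSETS = [set(c) for r in range(len(DOMAIN) + 1)
               for c in combinations(DOMAIN, r)]

    NO    = lambda P: lambda Q: not any(x in P and x in Q for x in DOMAIN)   # (68)
    SOME1 = lambda P: lambda Q: any(x in P and x in Q for x in DOMAIN)       # (69a)
    SOME2 = lambda P: lambda Q: not NO(P)(Q)                                 # (69b): ¬¬∃ = ¬⟦NO⟧

    # The two candidate meanings for SOME never come apart truth-conditionally.
    assert all(SOME1(P)(Q) == SOME2(P)(Q) for P in SUBSETS for Q in SUBSETS)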

In order to preserve the NCC in light of such a challenge, we need some notion of the inherent complexity of meaning for a morpheme, one that cuts finer than truth-conditional equivalence and that can capture, for example, felt differences in meaning between sentences like (70a) and (70b): (70b) is hard even to understand, let alone to recognize as truth-conditionally equivalent to (70a).25

    (70)  a.  Mary is taller than John is.
          b.  Mary is less short than John is.

Resolving the facts surrounding negation will involve much more targeted study than we can possibly provide here, as it will require converging evidence from multiple sources. Typologically, we might expect to find a language in which no transparently maps to a piece meaning the same thing as some plus something else. It is also likely important that some combinations of functional elements and negation do not seem to be attested (for example, alongside nor there is no *nand, and alongside none there is no *nall; Horn 1972).

Finally, it may be possible to test for meaning complexity via the cognitive operations or processes recruited during language understanding (see Clark & Chase 1972 specifically on negation, and Lidz et al. 2011 on linking semantic representations to “level 1.5” cognitive descriptions à la Peacocke 1986).

5.5 Analytic/synthetic violations

How does the analysis extend to the special English comparatives that Embick (2007) discusses, which seem to violate the usual preference for synthetic over analytic marking, surfacing only in the analytic form?

    (71)  a.  *John is lazier than stupid.
          b.  John is more lazy than stupid.

Abstracting away from many details, Morzycki (2011) posits that a so-called “metalinguistic” comparative like (71b) expresses that John has some property that is more similar to the property LAZY than any property he has is to the property STUPID. This analysis can be adapted for the present account by positing that Embick’s silent morpheme κ takes a property of adjectival states s to a property of states sʹ that are “similar” to s, s ≈ sʹ.26

    (72)      ⟦κ⟧ = λPλs.∃sʹ[P(sʹ) & s ≈ sʹ]           ⟨⟨v,t⟩, ⟨v,t⟩⟩

Such a proposal would be incompatible with the constituency K1 in Figure 3, since ⟦CMPR-SUP⟧ wouldn’t have access to the “similarity states” that it measures and compares. It is straightforwardly compatible with K2; K3 would require re-typing ⟦κ⟧. Morphologically, both K2 and K3 can capture the facts: κ’s intervention in K2 would block linear adjacency of the MUCHP to the aP; equally, the presence of κ as the head of the specifier in K3 would relabel it morphologically, and keep the local dislocation trigger MUCH from being visible.

Figure 3: Three options for four heads.

This is just a sketch, of course. Giannakidou & Yoon (2011) raise some concerns for Morzycki’s semantics, and leverage cross-linguistic data in service of theirs. It remains to be seen whether and how these proposals and discussion can be firmly accommodated within the present theory, and how they bear on the choices in Figure 3.

6 Conclusion

What is the purpose of the NCC? It narrows the set of semantic analyses for any particular set of data. Linguists often attempt to decompose as much as possible in their analyses. The NCC properly codifies that methodological intuition as a falsifiable claim about the human faculty of language. Yet, as far as the linguistic evidence in a given language goes, the NCC is decidedly non-empirical. That is the whole point: the grammatical constraint rules out all but one of several competing, equally good analyses, which narrows the field of possibilities for acquisition.

One source of evidence that the linguist has access to that the language acquisition device does not is typology. The analysis we have given for comparatives based on the NCC is nicely consistent with Bobaljik’s morphological typology; the competing, previous explanation, while reasonable, has technical problems when it is combined with the local dislocation analysis that the data suggest for English comparative formation. Further evidence from implicational universals is also relevant, as discussed in the previous section.

In section 3, we promised to discuss the fact that our semantic formalism provides no general procedure for determining in which order arguments must be taken. This problem is quite general, and has deep implications. For example, the analysis of determiners as expressing relations between sets reveals a number of shared interpretive properties that are cross-linguistically robust (Barwise & Cooper 1981). One such property is conservativity (i.e., ⟦DET⟧(X)(Y) ⇔ ⟦DET⟧(X)(Y ∩ X)): determiner relations “live on” the set denoted by their NP complement, as can be seen in the truth-conditional equivalence of (73).

    (73)  a.  Every dog is brown.           P ⊆ Q
          b.  Every dog is brown and a dog.           P ⊆ Q ∩ P

If every is interpreted as in (74a), this equivalence is captured. Yet, it is easy to imagine a quantifier just like EVERY but with the order of the λs reversed, (74b). The hypothetical ⟦SCHMEVERY⟧ would fail conservativity: while P ⊆ Q implies P ∩ Q ⊆ Q, P ∩ Q ⊆ Q fails to imply P ⊆ Q. While the conservativity generalization is robust, the semantic formalism that we’ve chosen only allows it to be captured descriptively (see Pietroski 2005); it doesn’t inherently constrain the set of possible interpretations for individual heads.

    (74)  a.  ⟦EVERY⟧ = λPλQ.P ⊆ Q
          b.  ⟦SCHMEVERY⟧ = λQλP.P ⊆ Q
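
A brute-force check over a small domain illustrates the contrast; in this sketch, predicate extensions are modeled as sets, and the helper name conservative is ours, for illustration only.

    from itertools import combinations

    DOMAIN = ["d1", "d2", "d3"]
    SUBSETS = [set(c) for r in range(len(DOMAIN) + 1)
               for c in combinations(DOMAIN, r)]

    EVERY     = lambda P: lambda Q: P <= Q     # (74a): P ⊆ Q
    SCHMEVERY = lambda Q: lambda P: P <= Q     # (74b): order of the λs reversed

    def conservative(DET):
        """DET(P)(Q) ⇔ DET(P)(Q ∩ P) for all P, Q over the toy domain."""
        return all(DET(P)(Q) == DET(P)(Q & P) for P in SUBSETS for Q in SUBSETS)

    print(conservative(EVERY))       # True
    print(conservative(SCHMEVERY))   # False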

Being able to freely swap the order of arguments of ⟦SUP⟧, to have λgλGλx rather than λGλgλx (see (29)), would require a syntax in which the superlative is contained within the comparative, and not the other way around. This would undermine the explanation of the morphological typology. There are probably many more such typological facts, which could turn out to be important in informing semantic theory: constraining the semantic formalism, and ultimately the space of possible denotations.

Competing Interests

The authors declare that they have no competing interests.

Notes

  1. There are some apparent exceptions in English, where it seems as if more tall is acceptable. These have been discussed elsewhere, and they are only apparent. See section 5.5 for a short discussion in the context of our proposal. [^]
  2. See also relevant discussion and references in Harley (2004); Husband (2011); Beavers & Koontz-Garboden (2012). [^]
  3. In considering (10), it is important to recognize what containment means and what it does not mean. It means that, in terms of grammatical triggers (the aspects of the syntax that would trigger morphological operations), the superlative contains all the same ones as the comparative. Thus, any morphological phenomenon that happens in the comparative should also happen in the superlative, because all the crucial elements of the comparative are there too. It does not mean that (what surfaces as) the comparative is found as a single identifiable syntactic sub-constituent of the superlative, which is not directly relevant. The “sub-constituent” interpretation is not what Bobaljik intended, either, given that he actually proposes (10) to satisfy (8) for superlatives in Finnish and related languages. In (10), CMPR is always present when SUP is, and, since CMPR is the head of the specifier, a structure with SUP in it will always contain CMPR as well: it will always be there as an active syntactic object. [^]
  4. There are almost no cases in the literature of local dislocation over a head movement trace. The analysis of Maltese object clitics by Shwayder (2014) is the only such case we have been able to find. Under the assumptions of that analysis, head-movement traces do not block local dislocation: accordingly, verbs can head-move to form a complex with aspect and agreement suffixes, and object clitics attach to this complex on the right by local dislocation, as in (i).
      (i)  a.  Agr[ Asp[ v[ ROOT [ object ]D           (HM)
           b.  [ ROOT + v + Asp + Agr ][ t[ t[ t[ object ]D           (LD)
           c.  [[ ROOT + v + Asp + Agr ] + object ]D
    In general, it seems unlikely that traces would block an operation triggered under true linear adjacency. [^]
  5. A structure like (17) with superlatives could be blocked for independent reasons. If the structurally-higher position for amazingly in (17) is related to the possibility of differentials in comparatives like Mary is two inches taller than Abdellah, superlatives do not allow these: *Mary is the two inches tallest has no interpretation (see Stateva 2003). [^]
  6. All accounts of the CSG need to be taken in conjunction with a principle ruling out accidental homophony such that B=A or C=A only in form. [^]
  7. The major alternative degree-theoretic treatment analyzes ⟦CMPR⟧ as type ⟨⟨d,t⟩, ⟨⟨d, ⟨e,t⟩⟩, ⟨e,t⟩⟩⟩ (the “degree-relational analysis”: Heim 1985, 2000, among others). We use the lower-typed version mainly for simplicity, but recall this version below to illustrate that our semantic proposal works either way. [^]
  8. Perhaps conspicuously absent from the representation in (23b) is the context variable C posited by Heim (1999) and others to help capture, in part, particular readings of superlative constructions like John wants to climb the highest mountain. Consideration of such data is beyond the scope of this paper; see Szabolcsi (1986) and Heim (1999) for early discussion. [^]
  9. A parallel story can be told when adopting the degree relation-based analysis of the gradable predicate (i.e., type ⟨d, ⟨e,t⟩⟩), as opposed to the measure function-based analysis adopted here; the relevant interpretations would be as in (ia)-(ic) below. Note that these representations assume that the than-clause delivers a degree predicate, type ⟨d,t⟩, rather than a degree d.
      (i)  ⟦CMPR⟧ = λDλgλx.∃d[g(x) ≥ d & d > max(D)]           ⟨⟨d,t⟩, ⟨⟨d, ⟨e,t⟩⟩, ⟨e,t⟩⟩⟩
           ⟦SUP1⟧ = λgλx.∀y[y ≠ x → ∃d[g(x) ≥ d & d > max({d | g(y) ≥ d})]]           ⟨⟨d, ⟨e,t⟩⟩, ⟨e,t⟩⟩
           ⟦SUP2⟧ = λGλgλx.∀y[y ≠ x → G({d | g(y) ≥ d})(g)(x)]           ⟨TYPE(⟦CMPR⟧), ⟨⟨d, ⟨e,t⟩⟩, ⟨e,t⟩⟩⟩
    [^]
  10. The idea that there is a binary diacritic feature that licenses the affixation (an idea we borrow from Bobaljik) should not be misunderstood. The application or non-application of affixation in these forms is somewhat variable (Graziano-King & Cairns 2005), and it is correlated, imperfectly, with certain phonological properties of the stem, which suggests some amount of generative capacity rather than a simple table look-up. Monosyllabic stems generally, but not always, undergo the affixation, plus many forms ending in -y (which pattern with monosyllables in other respects too: Chomsky & Halle 1968). Obvious exceptions are huge/*huger, fun/?funner. However, variable stem marking is probably also subject to many non-grammatical decision processes that are difficult to dissociate from true grammatical productivity. We do not see any serious problems that would arise if some grammatical visibility of the root phonology were allowed in this case; but the visibility issues that are raised by these are complex enough without raising this additional dimension, about which we have little to say. [^]
  11. Whether local dislocation gives a label to its whole output, such that it could be the object of further local dislocations, is another question, one which we will not deal with because the issue does not arise here. The literature has not dealt with the possibility of successive local dislocations either. A too-powerful interface can easily overgenerate (see Bjorkman & Dunbar 2016), and for local dislocation to be able to target the whole Ф-domain output by another local dislocation would change its character as a “linear” operation substantially. However, we leave this open; our notation of morphological labels in LS-pairs is just notation, and does not imply any claim that there is no additional structure. For simplicity, we have ignored the import of labels for later reordering operations within a Ф-domain, as with the “subword dislocation” cases discussed in Embick & Noyer (2001) and later work; labels are key to delimiting their scope in that literature. [^]
  12. Certain English vowels are reduced by the general rules of English phonology when they are not stressed. In -est we get the default reduced vowel (which is in fact better transcribed as [ɨ] than as [ə]: Flemming & Johnson 2007). In -er we seem to get the phonetic output that is often transcribed as the amalgamated segment [ɝ], which is also just the expected phonetic value for any reduced vowel in this context. [^]
  13. This assumes that vocabulary insertion takes place sequentially. We assume that the insertion of suffixes happens left to right from the root (rather than inside-out with respect to the syntactic structure, as proposed by Embick 2010 and Bobaljik & Wurmbrand 2013), except for roots themselves, which are inserted after null head pruning. [^]
  14. Principle (36) allows local dislocation out of a complex specifier (and into a complex specifier, as in the Latin -que example above), but only if it is indifferent to the syntactic category of the element inside the specifier. [^]
  15. With the limitation of context restrictions to one adjacent head, we also predict that adjacency of SUP to the root, or effective adjacency due to the intervening items being null, should be a necessary condition for SUP–triggered allomorphy. This is consistent with the ABC cases presented in Bobaljik (2012). For example, the Latin (ABC) superlative opt-im-us is unlike other Latin superlatives in that others generally have extra segmental content between the stem and the superlative affix -im-, as in long-iss-im-us, “longest,” pulcher-r-im-us, “most beautiful.” We predict that the presence of such material blocks contextual allomorphy of the root triggered by SUP. [^]
  16. We stick with the measure function terminology and types adopted in section 3.1. [^]
  17. See Schwarzschild (2006), Nakanishi (2007), Wellwood, Hacquard & Pancheva (2012), and Wellwood (2012), (2015), for extensive discussion on restrictions on permissible values of μ variables. Solt (2014) offers a related analysis for a covert counterpart of MUCH, and Wellwood (2014) offers some skepticism of index-based approaches to MUCH. [^]
  18. Note that we are assuming bare occurrences of much (i.e., Much wine spilled) involve a covert POS morpheme; see von Stechow (1984) and Kennedy (1999). And we set aside the question of differential comparatives in general, including those with much (i.e., Mary drank much more wine than John did). [^]
  19. See Pelletier (1974), Cartwright (1975) for nouns like coffee, and Parsons (1990), Kratzer (1996) for verbs like run, among others. Landman (2000) and Fults (2006) also offer a state-based analysis of adjectives (cf. Francez & Koontz-Garboden’s (2015) “abstract substance”-based approach). The proposal in the text is reminiscent of Park (2008) (that measure functions are introduced separately from adjectives) and Husband (2012) (that adjectives, at some level, involve states). An alternative analyzes gradable adjectives as predicates of individuals (e.g., Klein 1980, 1982 and Burnett 2012); incorporating this alternative into the present theory would require bridging delineation semantics and degree semantics, a task beyond the scope of this paper. [^]
  20. Does such an analysis predict sentences like (ia) to intuitively entail sentences like (ib)? We assume not, following Francez and Koontz-Garboden (in press): the inference to ‘taller than some standard’ in bare adjectival constructions like (ib) is akin to other familiar cases of domain restriction with existentially quantified expressions, which often invoke some contextually-salient amount or ordering for judgments of felicity/truth. See their paper for details.
      (i)  a.  Mary is taller than John is.
           b.  Mary is tall.
    [^]
  21. Kennedy & McNally (2010) argue that color terms are lexically ambiguous. Yet, (52) requires only fixing on the “quality” rather than “category” understanding of red. [^]
  22. This rule can be specified as below, with neutrality between the types of entities and events: Predicate Modification: If α is a branching node, {β, γ} the set of α’s daughters, and ⟦β⟧ᴬ and ⟦γ⟧ᴬ are both in D⟨η,t⟩, then ⟦α⟧ᴬ = λσ : σ ∈ Dη . ⟦β⟧ᴬ(σ) & ⟦γ⟧ᴬ(σ). [^]
  23. The vowel alternations in both cases are due to regular vowel mutation in non-final syllables: [ai] → [ɘi], [ai] → [ɘi]. [^]
  24. This debate specifically involves a quantificational analysis of CMPR that we have not discussed in this paper, yet the logic of how the NCC would apply here is clear enough. [^]
  25. Büring (2007) decomposes less and short into two pieces, both involving the morpheme LITTLE (Heim 2006). His decomposition of short has been challenged by Heim (2008). Heim’s analysis leaves some important questions open, and only some of the important judgments have been formally investigated (Beck 2013). [^]
  26. Furthermore, (72) could be easily generalized to account for sentences like Mary is more a semanticist than a syntactician. [^]

Acknowledgements

The authors wish to thank Emmanuel Chemla, Jeff Lidz, and three anonymous Glossa reviewers for their helpful comments and suggestions. This work was supported in part by the European Research Council under the European Union’s Seventh Framework Programme (FP/2007-2013), ERC Grant Agreement ERC-2011-AdG-295810 BOOTPHON; by the Agence Nationale pour la Recherche (ANR-2010-BLAN-1901-1 BOOTLANG, ANR-10-IDEX-0001-02, ANR-10-LABX-0087); and by the Fondation de France.

References

Adger, David; Bejar, Susana; Harbour, Daniel . (2003). Directionality of allomorphy: a reply to Carstairs-McCarthy. Transactions of the Philological Society 101 (1): 109. DOI: http://dx.doi.org/10.1111/1467-968X.00111

Alrenga, Peter; Kennedy, Chris; Merchant, Jason . (2012). A new standard of comparison. In: Arnett, Nathan; Bennett, Ryan (eds.), Proceedings of the 30th West Coast Conference on Formal Linguistics. Somerville, MA: Cascadilla Proceedings Project, pp. 32.

Baker, Mark . (1985).  Incorporation: a theory of grammatical function changing. PhD Thesis. Massachusetts Institute of Technology.

Bartsch, Renate; Vennemann, Theo . (1972).  Semantic structures: A study in the relation between semantics and syntax. Frankfurt am Main: Athenaum.

Barwise, John; Cooper, Robin . (1981).  Generalized quantifiers and natural language.  Linguistics and Philosophy 4 : 159. DOI: http://dx.doi.org/10.1007/BF00350139

Beavers, John; Koontz-Garboden, Andrew . (2012).  Manner and result in the roots of verbal meaning.  Linguistic Inquiry 43 (3) : 331. DOI: http://dx.doi.org/10.1162/LING_a_00093

Beck, Sigrid . (2013).  Lucinda driving too fast again–the scalar properties of ambiguous than-clauses.  Journal of Semantics 30 : 1. DOI: http://dx.doi.org/10.1093/jos/ffr011

Bhatt, Rajesh; Pancheva, Roumyana . (2004).  Late merger of degree clauses.  Linguistic Inquiry 35 (1) : 1. DOI: http://dx.doi.org/10.1162/002438904322793338

Bhatt, Rajesh; Takahashi, Shoichi . (2011).  Reduced and unreduced phrasal comparatives.  Natural Language and Linguistic Theory 29 : 581. DOI: http://dx.doi.org/10.1007/s11049-011-9137-1

Bjorkman, Bronwyn; Dunbar, Ewan . (2016).  Finite-state phonology predicts a typological gap in cyclic stress assignment.  Linguistic Inquiry 47

Bobaljik, Jonathan . (1995).  The syntax of verbal inflection. PhD Thesis. Massachusetts Institute of Technology.

Bobaljik, Jonathan . (2008). Where’s Ф? Agreement as a post-syntactic operation. In: Harbour, Daniel; Adger, David; Béjar, Susana (eds.), Phi-Theory: Phi features across interfaces and modules. Oxford University Press, pp. 295.

Bobaljik, Jonathan . (2012).  Universals in comparative morphology. Cambridge, MA: MIT Press.

Bobaljik, Jonathan; Wurmbrand, Susi . (2013).  Suspension across domains.  Distributed Morphology Today: Morphemes for Morris Halle, : 185. DOI: http://dx.doi.org/10.7551/mitpress/9780262019675.003.0011

Bochnak, M. Ryan . (2013).  Cross-linguistic variation in the semantics of comparatives. PhD thesis. University of Chicago.

Büring, Daniel . (2007). Cross-polar nomalies. In: Friedman, Tova; Gibson, Masayuki (eds.), Proceedings of Semantics and Linguistic Theory 17. Ithaca, NY: Cornell University, pp. 37.

Burnett, Heather . (2012).  The grammar of tolerance: on vagueness, context-sensitivity and the origin of scale structure. PhD thesis. UCLA.

Cartwright, Helen . (1975).  Amounts and measures of amount.  Noûs 9 (2) : 143. DOI: http://dx.doi.org/10.2307/2214598

Chomsky, Noam . (1957).  Syntactic structures. The Hague: Mouton.

Chomsky, Noam . (1995).  The minimalist program. Cambridge, MA: MIT Press.

Chomsky, Noam; Halle, Morris . (1968).  The sound pattern of English. Harper & Row.

Clark, Herbert H.; Chase, William G. . (1972).  On the process of comparing sentences against pictures.  Cognitive Psychology 3 : 472. DOI: http://dx.doi.org/10.1016/0010-0285(72)90019-9

Compton, Richard; Pittman, Christine . (2010).  Word-formation by phase in Inuit.  Lingua 120 (9) : 2167. DOI: http://dx.doi.org/10.1016/j.lingua.2010.03.012

Corver, Norbert . (1997).  MUCH-support as a last resort.  Linguistic Inquiry 28 (1) : 119. http://www.jstor.org/stable/4178967

Cresswell, M. J. . (1976). The semantics of degree. In: Partee, Barbara Hall (ed.), Montague grammar. New York: Academic Press, pp. 261.

Dowty, David R. . (1979).  Word meaning and Montague grammar. Dordrecht, The Netherlands: Kluwer Academic Publishers, 7 p. 415.

Embick, David . (2007).  Blocking effects and analytic/synthetic alternations.  Natural Language & Linguistic Theory 25 (1) : 1. DOI: http://dx.doi.org/10.1007/s11049-006-9002-9

Embick, David . (2010).  Localism versus globalism in morphology and phonology. Cambridge, MA: MIT Press.

Embick, David; Noyer, Rolf . (2001).  Movement operations after syntax.  Linguistic Inquiry 32 : 555. DOI: http://dx.doi.org/10.1162/002438901753373005

Flemming, Edward; Johnson, Stephanie . (2007).  Rosa’s roses: reduced vowels in American English.  Journal of the International Phonetic Association 37 : 83. DOI: http://dx.doi.org/10.1017/S0025100306002817

Fodor, Janet Dean . (1970).  Three reasons for not deriving “kill” from “cause to die”.  Linguistic Inquiry 1 (4) : 429. http://www.jstor.org/stable/4177587

Fodor, Jerry A.; Lepore, Ernie . (1998).  The emptiness of the lexicon: reflections on James Pustejovsky’s The Generative Lexicon.  Linguistic Inquiry 29 (2) : 269. DOI: http://dx.doi.org/10.1162/002438998553743

Francez, Itamar; Koontz-Garboden, Andrew . (2015).  Semantic variation and the grammar of property concepts.  Language 91 : 533. DOI: http://dx.doi.org/10.1353/lan.2015.0047

Fults, Scott . (2006).  The structure of comparison: An investigation of gradable adjectives. PhD thesis. University of Maryland.

Giannakidou, Anastasia; Yoon, Suwon . (2011).  The subjective mode of comparison: Metalinguistic comparatives in Greek and Korean.  Natural Language and Linguistic Theory 29 : 621. DOI: http://dx.doi.org/10.1007/s11049-011-9133-5

Graziano-King, Janine; Cairns, Helen Smith . (2005).  Acquisition of English comparative adjectives.  Journal of Child Language 32 : 345. DOI: http://dx.doi.org/10.1017/S0305000904006828

Hackl, Martin . (2009). On the grammar and processing of proportional quantifiers: most versus more than half. Natural Language Semantics 17: 63. DOI: http://dx.doi.org/10.1007/s11050-008-9039-x

Hacquard, Valentine . (2006).  Aspects of Modality. PhD thesis. MIT.

Haegeman, Liliane; Lohndal, Terje . (2010).  Negative concord and (multiple) agree: a case study of West Flemish.  Linguistic Inquiry 41 (2) : 181. DOI: http://dx.doi.org/10.1162/ling.2010.41.2.181

Halle, Morris; Marantz, Alec . (1993). Distributed morphology and the pieces of inflection. In: Hale, Kenneth; Keyser, Samuel Jay (eds.), The view from building 20. Cambridge, MA: MIT Press.

Harley, Heidi . (2004).  Wanting, having, and getting: a note on Fodor and Lepore 1998.  Linguistic Inquiry 35 (2) : 255. DOI: http://dx.doi.org/10.1162/002438904323019066

Haspelmath, Martin . (2007). Coordination. In: Shopen, Timothy (ed.), Language Typology and Syntactic Description. 2nd edition. Cambridge University Press.

Heim, Irene . (1985). Notes on comparatives and related matters In:  University of Texas, Austin. Unpublished manuscript.

Heim, Irene . (1999).  Notes on superlatives.  MIT. Unpublished manuscript, Available at: http://semanticsarchive.net/Archive/TI1MTlhZ/Superlative.pdf.

Heim, Irene . (2000). Degree operators and scope. In: Jackson, Brendan; Matthews, Tanya (eds.), Proceedings of SALT X. Cornell University, Ithaca, NY: CLC Publications, pp. 40. DOI: http://dx.doi.org/10.3765/salt.v10i0.3102

Heim, Irene . (2006). LITTLE. In: Gibson, Masayuki; Howell, Jonathan (eds.), Proceedings of Semantics and Linguistic Theory 16. Ithaca, NY: Cornell University, pp. 35.

Heim, Irene . (2008). Decomposing antonyms? In:  Gronn, Atle (ed.),   Proceedings of Sinn und Bedeutung 12. Oslo: ILOS, pp. 212.

Heim, Irene; Kratzer, Angelika . (1998).  Semantics in generative grammar. Malden, MA: Blackwell.

Horn, Laurence . (1972).  On the semantic properties of the logical operators in English. Bloomington, IN: Indiana University Linguistics Club.

Husband, E. Matthew . (2011). Rescuing manner/result complementarity from certain death. Proceedings of the 47th annual Chicago Linguistics Society. Forthcoming.

Husband, E. Matthew . (2012). On the compositional nature of states. Linguistik Aktuell/Linguistics Today 188.

Katz, Jerrold J.; Fodor, Jerry A. . (1963).  The structure of a semantic theory.  Language 39 (2) : 170. DOI: http://dx.doi.org/10.2307/411200

Kennedy, Chris . (1999).  Projecting the adjective: the syntax and semantics of gradability and comparison. New York: Garland.

Kennedy, Chris . (2007).  Modes of comparison.  Proceedings of the Chicago Linguistics Society. 43 : 141.

Kennedy, Chris; McNally, Louise . (2010).  Color, context, and compositionality.  Synthese 174 (1) : 79. DOI: http://dx.doi.org/10.1007/s11229-009-9685-7

Klein, Ewan . (1980).  A semantics for positive and comparative adjectives.  Linguistics and Philosophy 4 : 1. DOI: http://dx.doi.org/10.1007/BF00351812

Klein, Ewan . (1982).  The interpretation of adjectival comparatives.  Journal of Linguistics 18 (1) : 113. DOI: http://dx.doi.org/10.1017/S0022226700007271

Kratzer, Angelika . (1996). Severing the external argument from its verb. In: Rooryck, Johan; Zaring, Laurie (eds.), Phrase structure and the lexicon. Dordrecht, The Netherlands: Kluwer Academic Publishers, pp. 109.

Kratzer, Angelika . (2008). On the plurality of verbs. In: Dölling, Johannes; Heyde-Zybatow, Tatjana; Schäfer, Martin (eds.), Event Structures in Linguistic Form and Interpretation. Berlin: Mouton de Gruyter.

Landman, Fred . (2000).  Events and plurality. Norwell, MA: Kluwer Academic Publishers.

Lechner, Winfried . (2001).  Reduced and phrasal comparatives.  Natural Language and Linguistic Theory 19 (4) : 683. DOI: http://dx.doi.org/10.1023/A:1013378908052

Levin, Beth; Hovav, Malka Rappaport . (2005).  Argument realization. Cambridge, UK: Cambridge University Press.

Lidz, Jeffrey; Halberda, Justin; Pietroski, Paul; Hunter, Tim . (2011). Interface transparency and the psychosemantics of most. Natural Language Semantics: 1. DOI: http://dx.doi.org/10.1007/s11050-010-9062-6

Marvin, Tatiana . (2002).  Topics in the stress and syntax of words. PhD Thesis. MIT.

Merchant, Jason . (2009).  Phrasal and clausal comparatives in Greek and the abstractness of syntax.  Journal of Greek Linguistics 9 : 134.

Morzycki, Marcin . (2011).  Metalinguistic comparison in an alternative semantics for imprecision.  Natural Language Semantics 19 : 39. DOI: http://dx.doi.org/10.1007/s11050-010-9063-5

Nakanishi, Kimiko . (2007).  Measurement in the nominal and verbal domains.  Linguistics and Philosophy 30 : 235. DOI: http://dx.doi.org/10.1007/s10988-007-9016-8

Park, So-Young . (2008).  Functional categories: the syntax of DP and DegP. PhD thesis. University of Southern California.

Parsons, Terence . (1990). Events in the semantics of English: a study in subatomic semantics In:  Current studies in linguistics series no. 19. Cambridge, Massachusetts: MIT Press, p. 334.

Partee, Barbara Hall; Rooth, Mats . (1983). Generalized conjunction and type ambiguity. In: Bäuerle, Rainer; Schwarze, Christoph; von Stechow, Arnim (eds.), Meaning, use and interpretation of language. de Gruyter, pp. 362.

Peacocke, Christopher . (1986).  Explanation in computational psychology: Language, perception and level 1.5.  Mind and Language 1 : 101.

Pelletier, Francis Jeffry . (1974).  On some proposals for the semantics of mass nouns.  Journal of Philosophical Logic 3 : 87. http://www.jstor.org/stable/30226085

Pietroski, Paul . (2005).  Events and semantic architecture. Oxford, UK: Oxford University Press.

Pustejovsky, James . (1995).  The Generative Lexicon. Cambridge, MA: MIT Press.

Schwarzschild, Roger . (2006).  The role of dimensions in the syntax of noun phrases.  Syntax 9 (1) : 67. DOI: http://dx.doi.org/10.1111/j.1467-9612.2006.00083.x

Schwarzschild, Roger . (2008).  The semantics of comparatives and other degree constructions.  Language and Linguistics Compass 2 (2) : 308. DOI: http://dx.doi.org/10.1111/j.1749-818X.2007.00049.x

Shwayder, Kobey . (2014).  Interaction of phonology and morphology in Maltese and Makassarese clitics.  University of Pennsylvania Working Papers in Linguistics 20 (1) : 301.

Solt, Stephanie . (2014).  Q-adjectives and the semantics of quantity.  Journal of Semantics 32 : 1. DOI: http://dx.doi.org/10.1093/jos/fft018

Stateva, Penka . (2003). Superlative more. In: Young, Robert B.; Zhou, Yuping (eds.), Proceedings of SALT XIII. Cornell University, Ithaca, NY: CLC Publications, pp. 276. DOI: http://dx.doi.org/10.3765/salt.v13i0.2893

Szabolcsi, Anna . (1986). Comparative superlatives. In: Fukui, Naoki; Rapoport, Tova; Sagey, Elizabeth (eds.), MIT Working Papers in Linguistics 8: 245.

Szabolcsi, Anna . (2012). Compositionality without word boundaries: (the) more and (the) most. In: Proceedings of Semantics and Linguistic Theory 22. Cornell University, Ithaca, NY: CLC Publications, pp. 1. DOI: http://dx.doi.org/10.3765/salt.v22i0.2629

Travis, Lisa . (1984).  Parameters and effects of word order variation. PhD Thesis. MIT.

von Fintel, Kai . (1999).  NPI licensing, Strawson entailment, and context dependency.  Journal of Semantics 16 : 97. DOI: http://dx.doi.org/10.1093/jos/16.2.97

von Stechow, Arnim . (1984).  Comparing semantic theories of comparison.  Journal of Semantics 3 (1) : 1. DOI: http://dx.doi.org/10.1093/jos/3.1-2.1

Wellwood, Alexis . (2012). Back to basics: more is always much-er. In: Chemla, Emmanuel; Homer, Vincent; Winterstein, Grégoire (eds.), Proceedings of Sinn und Bedeutung 17. Paris: ENS.

Wellwood, Alexis . (2014).  Measuring predicates. PhD thesis. University of Maryland, College Park.

Wellwood, Alexis . (2015).  On the semantics of comparison across categories.  Linguistics and Philosophy 38 (1) : 67. DOI: http://dx.doi.org/10.1007/s10988-015-9165-0

Wellwood, Alexis; Hacquard, Valentine; Pancheva, Roumyana . (2012).  Measuring and comparing individuals and events.  Journal of Semantics 29 (2) : 207. DOI: http://dx.doi.org/10.1093/jos/ffr006