
" Lexical propensities in phonology: "


Document Type : Latin Dissertation
Language of Document : English
Record Number : 904918
Doc. No : TL0rz210sn
Main Entry : Zymet, Jesse
Title & Author : Lexical propensities in phonology / Zymet, Jesse; Hayes, Bruce P.; Zuraw, Kie
College : UCLA
Date : 2018
Abstract : Traditional theories of phonological variation propose that morphemes be encoded with descriptors such as [+/- Rule X], to capture which of them participate in a variable process. More recent theories predict that morphemes can have LEXICAL PROPENSITIES: idiosyncratic, gradient rates at which they participate in a process—e.g., [0.7 Rule X]. This dissertation argues that such propensities exist, and that a binary distinction is not rich enough to characterize participation in variable processes. Corpus investigations into Slovenian palatalization and French liaison reveal that individual morphemes pattern across an entire propensity spectrum, and that encoding individual morphemes with gradient status improves model performance. Furthermore, an experimental investigation into French speakers’ intuitions suggests that they internalize word-specific propensities to undergo liaison. The dissertation turns to modeling language learners’ ability to acquire the idiosyncratic behavior of individual attested morphemes while frequency matching to statistical generalizations across the lexicon. A recent model based in Maximum Entropy Harmonic Grammar (MaxEnt) makes use of general constraints that putatively capture statistical generalizations across the lexicon, as well as lexical constraints governing the behavior of individual words. A series of learning simulations reveals that the approach fails to learn statistical generalizations across the lexicon: lexical constraints are so powerful that the learner comes to acquire the behavior of each attested form using only these constraints, at which point the general constraint is rendered ineffective. A GENERALITY BIAS is therefore attributed to learners, whereby they privilege general constraints over lexical ones. It is argued that MaxEnt fails to represent this property in its current formulation, and that it be replaced with the hierarchical MIXED-EFFECTS LOGISTIC REGRESSION MODEL (MIXED-EFFECTS MAXENT), which is shown to succeed in learning both a frequency-matching grammar and lexical propensities, by encoding general constraints as fixed effects and lexical constraints as a random effect. The learner treats the grammar and lexicon differently, in that vocabulary effects are subordinated to broad, grammatical effects in the learning process.
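Illustration : A minimal toy sketch of the mixed-effects MaxEnt idea summarized above (not the dissertation's own code; the data, names, and hyperparameters below are hypothetical). One shared weight serves as the general, frequency-matching constraint (fixed effect), while per-word offsets serve as lexical constraints (random effect) that a Gaussian prior keeps small, so word-specific propensities are learned only insofar as the data demand them.

# Toy mixed-effects MaxEnt-style learner (hypothetical data and variable names).
import numpy as np

# Toy corpus: for each word, (tokens observed, tokens undergoing the variable process).
counts = np.array([[50, 45], [40, 30], [60, 20], [30, 3], [20, 19]], dtype=float)
n, k = counts[:, 0], counts[:, 1]

mu = 0.0                # fixed effect: lexicon-wide propensity (log-odds)
b = np.zeros(len(n))    # random effects: per-word lexical propensities
sigma2 = 1.0            # prior variance on lexical offsets (the "generality bias" knob)
lr = 0.01               # gradient-ascent step size

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(5000):
    p = sigmoid(mu + b)
    resid = k - n * p                  # gradient of the binomial log-likelihood
    mu += lr * resid.sum()             # general constraint frequency-matches the lexicon
    b += lr * (resid - b / sigma2)     # lexical offsets are shrunk by the Gaussian prior

print("lexicon-wide rate:", sigmoid(mu))
print("word-specific rates:", sigmoid(mu + b))

In this sketch the prior variance sigma2 plays the role of the generality bias: shrinking it forces the learner to rely on the general constraint, while enlarging it lets the lexical offsets absorb more of the variation.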
Added Entry : Hayes, Bruce P; Zuraw, Kie
Added Entry : UCLA

Attachments
Title : 0rz210sn_12727.pdf
File Name : 0rz210sn.pdf
General Content Type : Latin Dissertation
Material Type : Text
Format : application/pdf
Size : 6.75 MB
Width : 85
Length : 85