eprintid: 10132042
rev_number: 14
eprint_status: archive
userid: 608
dir: disk0/10/13/20/42
datestamp: 2021-07-30 10:58:20
lastmod: 2021-12-24 23:22:00
status_changed: 2021-07-30 10:58:20
type: article
metadata_visibility: show
creators_name: Purver, M
creators_name: Sadrzadeh, M
creators_name: Kempson, R
creators_name: Wijnholds, G
creators_name: Hough, J
title: Incremental Composition in Distributional Semantics
ispublished: inpress
divisions: UCL
divisions: B04
divisions: C05
divisions: F48
keywords: Incrementality, Semantics, Vector space semantics, Incremental disambiguation
abstract: Despite the incremental nature of Dynamic Syntax (DS), its semantic grounding remains that of predicate logic, itself grounded in set theory, and so it is poorly suited to expressing the rampantly context-relative nature of word meaning and related phenomena such as the incremental judgements of similarity needed for modelling disambiguation. Here, we show how DS can be assigned a compositional distributional semantics which enables such judgements and makes it possible to incrementally disambiguate language constructs using vector space semantics. Building on a proposal in our previous work, we implement and evaluate our model on real data, showing that it outperforms a commonly used additive baseline. In conclusion, we argue that these results lay the groundwork for an account of the non-determinism of lexical content, in which the nature of word meaning lies in its dependence on surrounding context for its construal.
date: 2021-07-07
date_type: published
publisher: SPRINGER
official_url: http://dx.doi.org/10.1007/s10849-021-09337-8
oa_status: green
full_text_type: pub
language: eng
primo: open
primo_central: open_green
verified: verified_manual
elements_id: 1878083
doi: 10.1007/s10849-021-09337-8
lyricists_name: Sadrzadeh, Mehrnoosh
lyricists_id: MSADR73
actors_name: Flynn, Bernadette
actors_id: BFFLY94
actors_role: owner
full_text_status: public
publication: Journal of Logic, Language and Information
pages: 28
citation: Purver, M; Sadrzadeh, M; Kempson, R; Wijnholds, G; Hough, J; (2021) Incremental Composition in Distributional Semantics. Journal of Logic, Language and Information. 10.1007/s10849-021-09337-8 <https://doi.org/10.1007/s10849-021-09337-8>. (In press). Green open access
document_url: https://discovery.ucl.ac.uk/id/eprint/10132042/1/Purver2021_Article_IncrementalCompositionInDistri.pdf