Hands-on Distributional Semantics – From first steps to interdisciplinary applications

LaCo Foundational course, Week 3, each day

Lecturers:
Stefan Evert (University of Erlangen-Nuremberg)
Gabriella Lapesa (University of Stuttgart)

Website: http://wordspace.collocations.de/doku.php/course:esslli2021:start

Abstract: Distributional semantic models (DSMs) are based on the assumption that the meaning of a word can (at least to a certain extent) be inferred from its usage, i.e. from its distribution in text. These models therefore build semantic representations dynamically, through a statistical analysis of the contexts in which words occur. DSMs are a promising technique for overcoming the lexical acquisition bottleneck through unsupervised learning, and their distributed representations provide a cognitively plausible, robust and flexible architecture for the organisation and processing of semantic information.

In this introductory course we will highlight the interdisciplinary potential of DSMs beyond standard semantic similarity tasks; our overview will touch upon cognitive modelling, theoretical linguistics, and computational social science. The course aims to equip participants with the background knowledge and skills needed to build different kinds of DSM representations and apply them to a wide range of tasks. There will be a particular focus on practical exercises using user-friendly software packages and various pre-built models.
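To make the abstract's description concrete, the sketch below builds a tiny count-based DSM: a word-by-word co-occurrence matrix over a toy corpus, weighted with PPMI and compared via cosine similarity. This is only an illustrative example of the general technique, not the course's own software or exercises; the corpus, window size and weighting scheme are arbitrary assumptions chosen for brevity.

import numpy as np

# Toy corpus: each sentence is a list of tokens (illustrative data only).
corpus = [
    "the cat chased the mouse".split(),
    "the dog chased the cat".split(),
    "the mouse ate the cheese".split(),
    "the dog ate the bone".split(),
]

# Build the vocabulary and a word-by-word co-occurrence matrix
# using a symmetric context window of +/- 2 tokens (assumed setting).
vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}
M = np.zeros((len(vocab), len(vocab)))

window = 2
for sent in corpus:
    for i, target in enumerate(sent):
        lo, hi = max(0, i - window), min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                M[index[target], index[sent[j]]] += 1

# Weight raw counts with PPMI (positive pointwise mutual information),
# a common association measure in count-based DSMs.
total = M.sum()
row = M.sum(axis=1, keepdims=True)
col = M.sum(axis=0, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((M * total) / (row * col))
ppmi = np.maximum(pmi, 0)
ppmi[~np.isfinite(ppmi)] = 0

# Cosine similarity between two word vectors: words that occur in
# similar contexts end up with similar vectors.
def cos(a, b):
    v, w = ppmi[index[a]], ppmi[index[b]]
    return v @ w / (np.linalg.norm(v) * np.linalg.norm(w) + 1e-12)

print(cos("cat", "dog"), cos("cat", "cheese"))

Running the script prints a higher similarity for "cat"/"dog" than for "cat"/"cheese", since the former pair shares more contexts in the toy corpus; real DSMs follow the same principle on much larger corpora, often with additional steps such as dimensionality reduction.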