Understanding and Creating Word Embeddings
Word embeddings allow you to analyze how different terms are used across a corpus of texts by capturing information about their contextual usage. Through a primarily theoretical lens, this lesson will teach you how to prepare a corpus and train a word embedding model. You will explore how word vectors w...
| Main Authors: | Avery Blankenship, Sarah Connell, Quinn Dombrowski |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Editorial Board of the Programming Historian, 2024-01-01 |
| Series: | The Programming Historian |
| Online Access: | https://programminghistorian.org/en/lessons/understanding-creating-word-embeddings |
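The record above describes a lesson on preparing a corpus and training a word embedding model. As a minimal sketch of that workflow, assuming Python and the gensim library (neither is named in this record), the example below tokenizes a toy corpus, trains a small Word2Vec model, and queries it for contextually similar words; the corpus, hyperparameters, and query term are all illustrative, not taken from the lesson itself.

```python
# Minimal word embedding sketch, assuming the gensim library (an assumption,
# not stated in this record) and a tiny illustrative corpus.
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess

# Hypothetical corpus: in practice this would be a prepared collection of texts.
documents = [
    "The recipe calls for two cups of flour and a pinch of salt.",
    "Add the butter and sugar, then bake until golden.",
    "The ship sailed from Boston harbor carrying flour and sugar.",
]

# Tokenize and lowercase each document.
sentences = [simple_preprocess(doc) for doc in documents]

# Train a skip-gram Word2Vec model; these hyperparameters are only illustrative.
model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the word vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    workers=1,
)

# Query the trained model for words that appear in similar contexts.
print(model.wv.most_similar("flour", topn=3))
```

With a corpus this small the similarity scores are not meaningful; the point is only to show the prepare-tokenize-train-query sequence that the lesson's workflow follows at scale.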
Similar Items
- Creating Welsh Language Word Embeddings
  by: Padraig Corcoran, et al.
  Published: (2021-07-01)
- A Word on Words in Words: How Do Embedded Words Affect Reading?
  by: Joshua Snell, et al.
  Published: (2018-09-01)
- Word Embedding for Semantically Relative Words: an Experimental Study
  by: Maria S. Karyaeva, et al.
  Published: (2018-12-01)
- Unsupervised Word Sense Disambiguation Using Word Embeddings
  by: Behzad Moradi, et al.
  Published: (2019-11-01)
- Dynamic contextualized word embeddings
  by: Hofmann, V., et al.
  Published: (2021)