Gender-Preserving Embeddings

Word embeddings attempt to represent the meaning of words as low-dimensional real-valued vectors such that words with similar meanings are embedded close to each other in the embedding space. Counting-based approaches that create word embeddings in a top-down fashion have been overtaken by prediction-based methods, which initialise the embeddings (typically at random) and subsequently update them so that the words in their co-occurrence contexts can be accurately predicted. Arguably, word2vec¹ is the most popular suite of algorithms for learning word embeddings, but the history of neural embeddings dates back to the work of Bengio (2008)…
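To make the prediction-based recipe concrete, here is a minimal sketch using gensim's word2vec implementation (the choice of library, the toy corpus, and all parameter values are illustrative assumptions, not taken from the post; the API shown assumes gensim 4.x):

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (illustrative only).
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "park"],
    ["a", "woman", "walks", "in", "the", "park"],
]

# sg=1 selects the skip-gram objective: embeddings start from random
# initialisation and are updated so that each word predicts the words
# in its co-occurrence window.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

# Words appearing in similar contexts end up close in the embedding space.
print(model.wv.most_similar("king", topn=3))
```

On such a tiny corpus the neighbours are noisy, but the same call on a large corpus recovers the familiar similarity structure of word2vec embeddings.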

March 9, 2019 · 4 min · Danushka