
Lecture 02: GloVe Visualizer

GloVe learns embeddings from global co-occurrence statistics. This page lets students explore real word neighborhoods, compare how close two words are, and try analogy arithmetic such as king - man + woman.

Shared setup

One set of vectors, three ways to explore it

File: glove.6B.100d.txt
Dimensions: 100
Use case: similarity + analogies
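The vector file above uses GloVe's plain-text layout: one word per line followed by its space-separated components. A minimal loading sketch (the two-line sample is made up; real glove.6B.100d.txt rows have 100 components):

```python
import numpy as np

def load_glove(lines):
    """Parse GloVe-format lines into a {word: vector} dict."""
    vectors = {}
    for line in lines:
        word, *values = line.rstrip().split(" ")
        vectors[word] = np.asarray(values, dtype=np.float32)
    return vectors

# Toy sample standing in for the real 100-dimensional file.
sample = ["king 0.1 0.3 0.5", "queen 0.1 0.4 0.5"]
vecs = load_glove(sample)
print(len(vecs), vecs["king"].shape)  # 2 (3,)
```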

What GloVe tries to capture

GloVe does not predict one context word at a time like Skip-gram. Instead, it builds a matrix of global co-occurrence counts and learns vectors whose dot products (plus bias terms) approximate the log of those counts.

Words with similar global co-occurrence patterns should land near each other in embedding space.
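Concretely, for each co-occurring pair GloVe minimizes a weighted squared error between the model's score and the log count. A sketch of that per-pair loss, with toy numbers rather than trained parameters (x_max and alpha are the defaults from the GloVe paper):

```python
import numpy as np

def weight(x, x_max=100.0, alpha=0.75):
    """GloVe's weighting f(x): ramps up for rare pairs, capped at 1."""
    return (x / x_max) ** alpha if x < x_max else 1.0

def pair_loss(w_i, w_j, b_i, b_j, x_ij):
    """f(X_ij) * (w_i . w~_j + b_i + b~_j - log X_ij)^2 for one pair."""
    return weight(x_ij) * (w_i @ w_j + b_i + b_j - np.log(x_ij)) ** 2

# Toy 2-dimensional word and context vectors.
w_i = np.array([0.2, 0.1])
w_j = np.array([0.4, 0.3])
print(pair_loss(w_i, w_j, 0.0, 0.0, 50.0))
```

Frequent pairs get weight 1 while rare (noisy) pairs are down-weighted, so a handful of huge counts cannot dominate training.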

1. Explore one word neighborhood

Type a word and fetch its nearest neighbors by cosine similarity. Good synonyms or related words should sit near the query word.
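Under the hood, "nearest neighbors by cosine similarity" is a rank over all other words. A self-contained sketch with toy vectors standing in for real GloVe rows:

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest(query, vectors, k=3):
    """Return the k words most cosine-similar to `query`, excluding itself."""
    q = vectors[query]
    scored = [(w, cosine(q, v)) for w, v in vectors.items() if w != query]
    return sorted(scored, key=lambda p: p[1], reverse=True)[:k]

# Toy vectors: "queen" points roughly the same way as "king", "apple" does not.
toy = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.8, 0.9, 0.1]),
    "apple": np.array([0.1, 0.0, 0.9]),
}
print(nearest("king", toy, k=2))
```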

2. Analogy arithmetic

Build a new vector by adding positive terms and subtracting negative ones. The nearest neighbors of the result often reveal semantic structure.

Positive term A
Negative term B
Positive term C
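The arithmetic behind this form can be sketched as: sum the positive term vectors, subtract the negative ones, then rank the remaining vocabulary by cosine similarity to the result (input terms are excluded, as is standard). The toy vectors are chosen for illustration, not taken from real GloVe:

```python
import numpy as np

def analogy(vectors, positive, negative, k=1):
    """Combine term vectors, then rank other words by cosine similarity."""
    target = sum(vectors[w] for w in positive) - sum(vectors[w] for w in negative)
    exclude = set(positive) | set(negative)
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = [(w, cos(target, v)) for w, v in vectors.items() if w not in exclude]
    return sorted(scored, key=lambda p: p[1], reverse=True)[:k]

# Toy vectors arranged so that king - man + woman lands on queen.
toy = {
    "king":  np.array([1.0, 1.0]),
    "man":   np.array([1.0, 0.0]),
    "woman": np.array([0.0, 1.0]),
    "queen": np.array([0.0, 2.0]),
}
print(analogy(toy, positive=["king", "woman"], negative=["man"]))
```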

3. Compare how close two words are

This is the simplest way to see whether two words behave like synonyms or near-synonyms in the embedding space.
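The comparison reduces to a single cosine score: values near 1 suggest near-synonyms, values near 0 suggest unrelated words. A minimal sketch with made-up vectors:

```python
import numpy as np

def similarity(vectors, a, b):
    """Cosine similarity between the vectors for words a and b."""
    va, vb = vectors[a], vectors[b]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# Toy vectors: "happy" and "glad" point the same way, "table" does not.
toy = {
    "happy": np.array([0.8, 0.6]),
    "glad":  np.array([0.7, 0.7]),
    "table": np.array([0.9, -0.4]),
}
print(round(similarity(toy, "happy", "glad"), 3))
```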

Current result

Nearest neighbors

Local 2D neighborhood view