
Word Embedding Model

Implement a simple word embedding lookup followed by average pooling.

Given a vocabulary embedding table and a sequence of word indices:

  1. Look up each word’s embedding vector from the table
  2. Average all embedding vectors to get a single representation

Input:

  • embedding_table: shape (vocab_size, embed_dim) — the embedding matrix
  • indices: shape (seq_len,) — integer word indices

Output: The averaged embedding vector of shape (embed_dim,).
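The two steps above can be sketched in a few lines of NumPy; the function name `average_embedding` is an assumption, not part of the problem statement:

```python
import numpy as np

def average_embedding(embedding_table, indices):
    # Step 1: look up each word's embedding row -> shape (seq_len, embed_dim)
    vectors = embedding_table[indices]
    # Step 2: average over the sequence axis -> shape (embed_dim,)
    return vectors.mean(axis=0)

# Toy example: vocab_size=3, embed_dim=2, sequence [0, 2]
table = np.array([[1.0, 2.0],
                  [3.0, 4.0],
                  [5.0, 6.0]])
idx = np.array([0, 2])
print(average_embedding(table, idx))  # -> [3. 4.]
```

Fancy indexing (`embedding_table[indices]`) handles repeated indices naturally: a word appearing twice contributes its vector twice to the average.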

Hints

Topics: embedding, nlp, average-pooling, lookup