LLM Probability Explorer

Watch a real language model predict the next word. Choose a sentence starter, see the probability distribution over possible next words, and build sentences one token at a time.

Pedagogical Goals

  • Show students what "next-token prediction" actually looks like in practice
  • Make the probability distribution over next tokens visible and tangible
  • Illustrate that LLMs don't "know" what to say — they assign probabilities to every possible continuation
  • Let students experience how context shapes predictions by building sentences incrementally

How It Works

The explorer sends prompts to GPT via the course API and requests the top-k token probabilities (logprobs). Students see a bar chart of the most likely next tokens and can click any token to append it to the sentence. The "auto-complete" mode lets the model pick tokens automatically, showing how sentences emerge from sequential probability sampling.
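The core transformation is small: the API returns log probabilities, which the client exponentiates into plain probabilities for the bar chart, and auto-complete mode samples from that distribution. A minimal sketch, assuming an OpenAI-style `top_logprobs` response shape (the type and field names here are illustrative, not the tool's actual code):

```typescript
// One candidate next token from a logprobs response (assumed shape).
type TopLogprob = { token: string; logprob: number };

// Convert top-k logprobs into a probability distribution.
// Logprobs are natural logs, so exponentiating recovers the probability.
function toDistribution(topLogprobs: TopLogprob[]): { token: string; prob: number }[] {
  return topLogprobs.map(({ token, logprob }) => ({ token, prob: Math.exp(logprob) }));
}

// Pick the next token proportionally to its probability (the "auto-complete"
// behavior). The top-k probabilities don't sum to exactly 1, so renormalize
// against their total before sampling.
function sampleToken(dist: { token: string; prob: number }[]): string {
  const total = dist.reduce((sum, d) => sum + d.prob, 0);
  let r = Math.random() * total;
  for (const d of dist) {
    r -= d.prob;
    if (r <= 0) return d.token;
  }
  return dist[dist.length - 1].token; // guard against floating-point drift
}
```

Repeating `sampleToken` and appending each result to the prompt is exactly the incremental sentence-building loop students drive by hand when they click tokens.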

How It Was Built

Built as a client component that calls a dedicated API endpoint. The API forwards requests to Azure OpenAI with logprobs enabled. The visualization uses a horizontal bar chart showing token probabilities, with color coding to distinguish high-probability from low-probability continuations.
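The bar-chart styling described above reduces to a pure function from probability to bar width and color. A sketch of that idea, with illustrative thresholds (the actual tool's cutoffs and palette are not specified here):

```typescript
// Map a token's probability to a bar width (percent) and a color bucket.
// Thresholds are assumptions for illustration: >= 50% reads as "likely",
// >= 10% as "plausible", anything lower as a long-tail continuation.
function barStyle(prob: number): { widthPct: number; color: string } {
  const color = prob >= 0.5 ? "green" : prob >= 0.1 ? "orange" : "gray";
  return { widthPct: Math.round(prob * 100), color };
}
```

Keeping this as a pure function makes the visualization trivial to unit-test and keeps all rendering logic on the client, while the server endpoint stays a thin proxy that only adds the logprobs flag to the Azure OpenAI request.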