LLM Explorer

Interactive tools for understanding how LLMs work

Base: Llama-3.2-3B | Chat: Llama-3.2-3B-Instruct

Step-by-Step Next-Token Prediction

Enter a prompt and watch the model predict one token at a time. Each step shows the probability distribution over the vocabulary.

Settings

Temperature (range 0 to 2.5, default 0.8)

Controls randomness. At 0 the model always picks the most probable token; higher values make surprising choices more likely. The default of 0.8 gives coherent but varied output.
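The temperature behavior described above can be sketched in a few lines of NumPy. The function name `softmax_with_temperature` is illustrative, not part of the tool; treating temperature 0 as a greedy argmax is an assumption that matches the description.

```python
import numpy as np

def softmax_with_temperature(logits, temperature):
    """Turn raw logits into a probability distribution, scaled by temperature."""
    logits = np.asarray(logits, dtype=float)
    if temperature == 0:
        # Greedy: all probability mass on the single most likely token.
        probs = np.zeros_like(logits)
        probs[np.argmax(logits)] = 1.0
        return probs
    scaled = logits / temperature
    scaled -= scaled.max()          # subtract max for numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

logits = [2.0, 1.0, 0.5]
print(softmax_with_temperature(logits, 0))    # one-hot: greedy pick
print(softmax_with_temperature(logits, 0.8))  # peaked but not deterministic
print(softmax_with_temperature(logits, 2.5))  # flatter: surprises more likely
```

Raising the temperature divides the logits by a larger number, shrinking the gaps between them, so after the softmax the distribution is flatter and low-ranked tokens get a real chance of being sampled.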

Top-k (range 5 to 100)

Limits how many of the most probable tokens the model samples from, and how many rows appear in the probability table.
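Top-k filtering can be sketched the same way: keep only the k highest-probability tokens and renormalize so the kept probabilities sum to 1. The name `top_k_filter` is illustrative.

```python
import numpy as np

def top_k_filter(probs, k):
    """Zero out all but the k most probable tokens, then renormalize."""
    probs = np.asarray(probs, dtype=float)
    keep = np.argsort(probs)[::-1][:k]   # indices of the k largest probabilities
    masked = np.zeros_like(probs)
    masked[keep] = probs[keep]
    return masked / masked.sum()

probs = [0.5, 0.3, 0.1, 0.1]
print(top_k_filter(probs, 2))   # [0.625, 0.375, 0.0, 0.0]
```

With k=2 only the two strongest candidates survive, and their probabilities are rescaled (0.5 and 0.3 become 0.625 and 0.375); the tokens shown in the tool's probability table are exactly this surviving set.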

Steps (range 1 to 20)

How many tokens to generate.

Show each step

When on, every step displays the full probability table and highlights the token that was selected (max 20 steps). When off, only the final generated text is shown (up to 100 steps).
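Putting the settings together, one step-by-step generation loop might look like the sketch below. The tiny vocabulary and `toy_logits` function are hypothetical stand-ins for a real model's forward pass (the tool itself runs Llama-3.2-3B); all names here are illustrative.

```python
import numpy as np

VOCAB = ["the", "cat", "sat", "on", "mat", "."]
rng = np.random.default_rng(0)

def toy_logits(tokens):
    # Stand-in for a real LM forward pass: strongly favors the token
    # that follows the last context token in VOCAB order.
    nxt = (VOCAB.index(tokens[-1]) + 1) % len(VOCAB)
    logits = np.full(len(VOCAB), -2.0)
    logits[nxt] = 3.0
    return logits

def sample_step(logits, temperature, top_k):
    """One next-token prediction: temperature scaling, top-k filter, sample."""
    if temperature == 0:
        return int(np.argmax(logits))          # greedy pick
    scaled = logits / temperature
    scaled -= scaled.max()                     # numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    keep = np.argsort(probs)[::-1][:top_k]     # top-k filtering
    masked = np.zeros_like(probs)
    masked[keep] = probs[keep]
    masked /= masked.sum()
    return int(rng.choice(len(probs), p=masked))

def generate(prompt_tokens, steps, temperature=0.8, top_k=5):
    tokens = list(prompt_tokens)
    for _ in range(steps):
        i = sample_step(toy_logits(tokens), temperature, top_k)
        tokens.append(VOCAB[i])                # each step appends one token
    return tokens

print(generate(["the"], 5, temperature=0))
# ['the', 'cat', 'sat', 'on', 'mat', '.']  (greedy is deterministic)
```

At temperature 0 the chain is fully deterministic, which is why the tool's output stops varying when the temperature slider is at its minimum; any positive temperature reintroduces sampling over the top-k candidates.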

Updated 2026-02-19 11:50:48 CT