Gemma 3 trees

Reducing LLMs' output to a single sample is an act of violence. We could have had blooming worlds of diversity; instead we got AI slop. This page is an attempt to strike back, if ever so slightly.

Three prompts, 10 Gemma 3 models (5 base and 5 instruct), and heuristic searches to find the 5-token sequences that cover the most probability mass, plus a 16-token (temp = 1) finisher for each of them. Out of (vocab size)⁵ = 262144⁵ = (2¹⁸)⁵ = 2⁹⁰ possible sequences of length 5, this means we have located (depending on the model and prompt) between 29.79% and 97.85% of the probability mass within just 0.000000000000000000084% of them.
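For the curious, here is a minimal sketch of what such a search might look like. Because a sequence's log-probability can only decrease as tokens are appended, a best-first search over prefixes pops complete sequences in exact probability order (within the pruned branching factor). The `log_probs` function is a hypothetical wrapper around one model forward pass, and the budget and pruning parameters here are illustrative assumptions, not the actual values used for this page.

```python
import heapq
import itertools
import math

def top_sequences(log_probs, seq_len=5, num_sequences=100_000, top_k=64):
    """Best-first search for the highest-probability token sequences.

    `log_probs(prefix)` is assumed to return (token_id, log_prob) pairs
    for the next token given `prefix`, sorted by descending log_prob
    (e.g. a thin wrapper around one Gemma forward pass). Only the
    `top_k` most likely children of each prefix are expanded.
    """
    counter = itertools.count()           # tie-breaker so heapq never compares lists
    frontier = [(0.0, next(counter), [])]  # max-heap via negated cumulative log-prob
    complete = []                          # (cum_logprob, sequence) for finished leaves

    while frontier and len(complete) < num_sequences:
        neg_lp, _, prefix = heapq.heappop(frontier)
        if len(prefix) == seq_len:
            # Log-probs are monotone non-increasing along a branch, so
            # sequences complete in exact order of probability.
            complete.append((-neg_lp, prefix))
            continue
        for token_id, lp in log_probs(prefix)[:top_k]:
            heapq.heappush(frontier, (neg_lp - lp, next(counter), prefix + [token_id]))

    covered = sum(math.exp(lp) for lp, _ in complete)  # total probability mass found
    return complete, covered
```

Each returned 5-token sequence would then get its finisher by ordinary temperature-1 sampling of 16 further tokens from the same model.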

Given that this page contains ~31 million continuations in total, it is not meant to be read in its entirety by any single human - it is too large, much as LLMs themselves are too rich for any one person's comprehension. So my hope is that you open some random nodes and find... something funny?
