
Prompts

We use AI to generate responses and thinking in the game. All of these require prompts: proper instructions and the right context at the right time.

The prompts.js file contains the functions that compose these prompts.

It works with scenario data (persona, level, preferences) and prompt building blocks (characterDefinition, levelDefinition, summaries).

On top of these sit the actual prompt builder functions. You give one a scenario and it returns a single string ready to be sent to the LLM.

+-----------------+      +-------------------+      +-----------------+
|  Scenario Data  |----->|  Prompt Builders  |----->|  Prompt String  |
+-----------------+      +-------------------+      +-----------------+
        |                         |                         |
        v                         v                         v
 [persona, level,       [characterDefinition,      [final prompt sent
  preferences, ...]      levelDefinition,           to LLM API]
                         summaries, ...]

In other words: someUniquePrompt(scenario, previousLevelSummaries): string.
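A minimal sketch of such a builder, assuming hypothetical section names and scenario fields (the real builders in prompts.js may compose different sections):

```javascript
// Hypothetical builder: composes XML-style tagged sections into one string.
// Field names (persona, levelDefinition) are assumptions for illustration.
function buildLevelPrompt(scenario, previousLevelSummaries) {
  const sections = [
    `<persona>${scenario.persona}</persona>`,
    `<level>${scenario.levelDefinition}</level>`,
  ];
  // Earlier levels are folded in as summaries, not as full transcripts.
  if (previousLevelSummaries.length > 0) {
    sections.push(
      `<previous_levels>${previousLevelSummaries.join("\n")}</previous_levels>`
    );
  }
  return sections.join("\n");
}

const prompt = buildLevelPrompt(
  { persona: "grumpy wizard", levelDefinition: "Level 1: the gate" },
  ["The player solved the riddle."]
);
```

The return value is one big string, matching the `someUniquePrompt(scenario, previousLevelSummaries): string` shape above.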

All sections are concatenated into a structured prompt (often with XML-style tags) sent to the LLM for generation.

In addition to the prompt, we pass a list of messages for context. In the game this is usually the messages for a specific level. If you're talking to the system, however, should it know about your chat with the character? To keep the two conversations separate, we juggle memory.messages and memory.systemMessages.
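The juggling can be as simple as selecting which history accompanies the prompt. The two fields come from the text above; the selector function and message shape are assumptions:

```javascript
// Hypothetical selector: choose the message history to send with the prompt.
// memory.messages and memory.systemMessages are the two lists mentioned above.
function contextMessagesFor(memory, talkingToSystem) {
  // In-game chat uses the level's messages; the system chat keeps its own
  // history so neither conversation leaks into the other.
  return talkingToSystem ? memory.systemMessages : memory.messages;
}

const memory = {
  messages: [{ role: "user", content: "Hello, wizard!" }],
  systemMessages: [{ role: "user", content: "What is this game?" }],
};

const chosen = contextMessagesFor(memory, true);
```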