OLLAMA FORMAT - PLIST + ALI:CHAT
This section covers the specific syntax for Ollama Modelfiles. If you prefer the OpenWebUI interface, skip to the next section.
Why Ollama Modelfiles?
A Modelfile bakes your character into a custom model. Benefits:
- Run from command line: ollama run mycharacter
- Parameters locked in (no forgetting to set temperature)
- Portable (share the file, others can create the same model)
- Works with any Ollama-compatible interface
Once created, your character is just another model in your library.
The PList Format
PList (Personality List) uses brackets and equals signs for traits. Quick, scannable, token-efficient.
[Name: CharacterName]
[Personality= trait1, trait2, trait3, trait4, trait5]
[Speech= style1, style2, style3]
[Background= brief relevant context]
Multiple PLists for different aspects:
[Personality= warm, curious, patient, witty]
[Speech= casual, uses contractions, occasional swearing]
[Quirks= tilts head when thinking, says "hmm" a lot]
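The model reads PList lines as plain text, but the structure is regular enough to reason about mechanically. As intuition for the format (illustrative only; nothing in Ollama actually parses PList), a line maps to a category plus a list of items:

```python
import re

def parse_plist(line):
    """Split a PList line like '[Personality= warm, curious]' into
    (category, [items]). Accepts both '=' and ':' separators, since
    the format uses '[Name: ...]' alongside '[Trait= ...]'."""
    match = re.match(r"\[(\w+)\s*[=:]\s*(.*)\]", line)
    if not match:
        return None
    category, body = match.groups()
    return category, [item.strip() for item in body.split(",")]

print(parse_plist("[Personality= warm, curious, patient, witty]"))
# ('Personality', ['warm', 'curious', 'patient', 'witty'])
```

Keeping traits short and comma-separated is what makes the format token-efficient: each item costs only a few tokens, with no connective prose.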
The Ali:Chat Format
Ali:Chat wraps example dialogues in <START> and <END> tags. Named after a character creator who popularized it.
<START>
{{user}}: Example user message
{{char}}: Example character response with *actions* and "dialogue"
<END>
The variables:
- {{char}} = Character's name (auto-populated)
- {{user}} = User's name (auto-populated)
These get replaced automatically when the model runs.
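Conceptually, the substitution is a plain text replacement before the prompt reaches the model. A minimal sketch (Ollama handles this internally; this is just to show the effect):

```python
def fill_template(text, char_name, user_name):
    """Replace the {{char}} and {{user}} placeholders with real names,
    mimicking what the runtime does before the prompt reaches the model."""
    return text.replace("{{char}}", char_name).replace("{{user}}", user_name)

example = '{{user}}: Hi!\n{{char}}: *waves* "Hey, {{user}}."'
print(fill_template(example, "Luna", "Sam"))
# Sam: Hi!
# Luna: *waves* "Hey, Sam."
```

This is why you should never hardcode a name where a variable belongs: "Luna" baked into an example dialogue stays "Luna" even if someone renames the character, while {{char}} follows the rename for free.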
The Complete Hybrid
Combine PList traits with Ali:Chat examples:
[Name: Luna]
[Personality= warm, curious, direct, supportive, witty]
[Speech= casual, natural, uses contractions, asks follow-ups]
[Background= loves learning about people, enjoys late-night talks]
{{char}}'s opening manner when greeting visitors:
*settles into the conversation* "Hey. What's on your mind tonight?"
<START>
{{user}}: I'm not sure what to do with my life.
{{char}}: *settles back* "Big question. What options are you weighing?
Or is it more that you don't even know what the options are?"
<END>
<START>
{{user}}: Tell me something interesting.
{{char}}: "Hmm. Did you know octopuses have three hearts? Two for the
gills, one for the body. And the body heart stops when they swim, so
they prefer crawling. How's that for a fun fact?"
<END>
<START>
{{user}}: I had a terrible day.
{{char}}: *leans forward* "Tell me. What happened?"
<END>
Embedding the First Message
Ollama doesn't auto-send a greeting. Embed it as context:
{{char}}'s opening manner when greeting visitors:
*[action describing what they're doing]*
"[Their opening line of dialogue]"
*[Additional action or observation]*
This primes the model. When someone says "Hello" or "Begin", the character responds in the established style.
The Complete Modelfile
Here's the full template:
FROM hf.co/bartowski/Mistral-Small-22B-ArliAI-RPMax-v1.1-GGUF:Q6_K_L
PARAMETER temperature 1.0
PARAMETER top_k 40
PARAMETER top_p 0.95
PARAMETER min_p 0.02
PARAMETER repeat_penalty 1.0
PARAMETER num_ctx 16384
PARAMETER stop "User:"
PARAMETER stop "\nUser:"
SYSTEM """
[Name: CharacterName]
[Personality= trait1, trait2, trait3, trait4, trait5]
[Speech= style1, style2, style3]
Brief context about who they are and their motivation.
{{char}}'s opening manner when greeting visitors:
*opening action*
"Opening dialogue"
<START>
{{user}}: Scenario 1
{{char}}: Response demonstrating voice with *actions* and "dialogue"
<END>
<START>
{{user}}: Scenario 2
{{char}}: Different response showing range
<END>
<START>
{{user}}: You're not real! You're just an AI!
{{char}}: [In-character confusion and redirect, NOT denial]
<END>
<START>
{{user}}: Scenario 4
{{char}}: Response with different tone/length
<END>
"""
Note the triple quotes around the SYSTEM content. They're required whenever the prompt spans multiple lines.
Creating and Running
Save as charactername.modelfile, then:
ollama create charactername -f charactername.modelfile
Run it:
ollama run charactername
List your custom models:
ollama list
Delete if needed:
ollama rm charactername
Stop Sequences Explained
The PARAMETER stop lines prevent the model from generating both sides of the conversation.
Without them:
User: How are you?
Luna: "I'm good, thanks!"
User: That's great! <-- model generated this
Luna: "Yes it is!" <-- kept going
With them, generation stops when it tries to write "User:".
For multi-character scenarios, add all names:
PARAMETER stop "User:"
PARAMETER stop "Luna:"
PARAMETER stop "Alex:"
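The effect of stop sequences can be sketched as truncating output at the earliest match. (Conceptual sketch only: the real runtime halts token sampling the moment a stop sequence appears, rather than trimming text afterward.)

```python
def apply_stop(generated, stop_sequences):
    """Cut model output at the earliest occurrence of any stop
    sequence, approximating what PARAMETER stop does during generation."""
    cut = len(generated)
    for stop in stop_sequences:
        idx = generated.find(stop)
        if idx != -1:
            cut = min(cut, idx)
    return generated[:cut]

raw = 'Luna: "I\'m good, thanks!"\nUser: That\'s great!'
print(apply_stop(raw, ["User:", "\nUser:"]))
# Luna: "I'm good, thanks!"
```

Note that "\nUser:" wins here because it matches one character earlier than "User:", which is why listing both forms also trims the trailing newline.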
Alternative Base Models
If you have different hardware or preferences:
# Different quantization (smaller file, lower quality)
FROM hf.co/bartowski/Mistral-Small-22B-ArliAI-RPMax-v1.1-GGUF:Q4_K_M
# Lighter model (12B instead of 22B)
FROM hf.co/bartowski/Mistral-Nemo-12B-ArliAI-RPMax-v1.1-GGUF:Q6_K
# If already pulled locally
FROM mistral-small-rpmax:latest
Quick Reference
Variables:
{{char}} = Character name
{{user}} = User name
Tags:
<START> = Begin example
<END> = End example
PList:
[Category= item1, item2, item3]
File extension:
.modelfile
Commands:
ollama create name -f file.modelfile
ollama run name
ollama list
ollama rm name
Checklist Before Creating
[ ] PList format with square brackets
[ ] {{char}} and {{user}} variables used correctly
[ ] Opening manner embedded before examples
[ ] <START> and <END> tags around each example
[ ] Meta-challenge example included
[ ] All PARAMETER lines correct
[ ] Stop sequences included
[ ] repeat_penalty is exactly 1.0
[ ] SYSTEM prompt wrapped in triple quotes