🏗️ Ollama Modelfiles: Building Your Own Custom AI Model
What if you want an AI that always acts a certain way, without having to type out long instructions every single time? In Ollama, you can achieve this using a Modelfile.
Think of it as a recipe for creating your own version of a model. It is a simple text file that tells Ollama:
- Which base model to use (e.g., Llama 3, Gemma).
- What personality or behavior the model should have.
- What settings (like randomness/temperature) to apply.
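The three pieces above map directly onto Modelfile directives. Here is an illustrative sketch (the model tag and temperature value are just examples, not requirements):

```
# Base model to build on
FROM llama3.2:latest

# Lower temperature = less random, more deterministic output (illustrative value)
PARAMETER temperature 0.2

# Personality / behavior baked into the model
SYSTEM "You are a concise assistant."
```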
🎯 The Task: Create a JSON-Only Custom Model
Let's create a custom model named jsonllm that is hardwired to always return data in JSON format.
📝 Step 1: Create the Modelfile
Create a plain text file on your computer and name it exactly Modelfile (no extension like .txt). Open it in Notepad or VS Code and add the following lines:
```
# 1. Choose the base model
FROM llama3.2:latest

# 2. Set the custom behavior
SYSTEM "You are a strict data assistant. You must always output your answers in valid, well-structured JSON format. Do not write any conversational text."
```
🔨 Step 2: Build the Model
Open your terminal, navigate to the folder where you saved your Modelfile, and run the create command:
```
ollama create jsonllm -f Modelfile
```
Ollama will read your recipe and build the new model. When it finishes, the terminal shows a success message.
🔍 Step 3: Verify the Model
Let's check if our new model is ready to use by listing all installed models:
```
ollama ls
```
Success! Your custom jsonllm model is officially installed and ready to be used just like any other model.
💻 Step 4: Testing the Custom Model in Python
Now, let's test our creation using the Ollama Python API. Notice that we don't need to pass any system prompts here, because the behavior is already baked into the model itself!
```python
import ollama

# Call our newly created custom model
response = ollama.generate(
    model='jsonllm:latest',
    prompt='Culture of India'
)

print(response['response'])
```
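Because jsonllm is instructed to emit only JSON, you can parse its reply straight into a Python object with the standard json module. A minimal sketch — the parse_model_json helper below is our own illustration, not part of the ollama API:

```python
import json

def parse_model_json(text: str):
    """Parse a model reply that should be valid JSON.

    Returns the decoded object, or None if the reply is not
    valid JSON (models can occasionally slip despite the prompt).
    """
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        return None

# In practice you would pass response['response'] from the
# ollama.generate() call above; here we use a sample string.
sample = '{"country": "India", "topics": ["festivals", "cuisine"]}'
data = parse_model_json(sample)
print(data['topics'])  # → ['festivals', 'cuisine']
```

Guarding the parse like this keeps your pipeline from crashing on the rare malformed reply.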
📌 Expected Output
The model should reply with a JSON object describing Indian culture — and nothing else, since the system prompt forbids conversational text.
🚀 Conclusion
Modelfiles are incredibly powerful. Instead of cluttering your Python code with massive system prompts and configurations, you can build specialized models (like a sql-generator, a code-reviewer, or a json-formatter) and share them easily.
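For instance, a sql-generator model could be built the same way. This Modelfile is a hypothetical sketch (the base model and prompt are just examples):

```
FROM llama3.2:latest
SYSTEM "You are a SQL assistant. Reply only with a single valid SQL query. Do not add explanations or markdown formatting."
```

Build it with ollama create sql-generator -f Modelfile, exactly as in Step 2.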
To learn about all the parameters you can tweak, check out the Official Ollama Modelfile Documentation.