AI Chat Roles

Summary of AI Chat Roles in Python APIs

In conversational AI APIs and frameworks such as OpenAI, Ollama, and LangChain, message roles identify which entity each piece of text belongs to. They help the model keep track of conversational structure and context.

The common roles and their purposes:

  • system: Sets up the behavior, tone, and personality of the model; it gives the core instructions that guide all replies.
  • user: Represents what the human user says or asks and serves as the actual prompt input.
  • assistant: Contains the model’s response, generated after processing the prior messages.
  • tool: Holds the results of external tool calls in advanced integrations.
  • function: Legacy predecessor of the tool role, used in older API setups.
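
To make the list concrete, here is a minimal sketch of how these roles fit together in a single conversation. It is plain Python data using the role/content field names shared by the OpenAI and Ollama chat formats; the conversation itself is invented, and tool/function messages are omitted because their exact fields vary between APIs.

# An invented multi-turn conversation expressed as role-tagged messages.
# The 'role' / 'content' keys follow the common OpenAI/Ollama chat format.
conversation = [
    {'role': 'system', 'content': 'You are a concise financial expert.'},
    {'role': 'user', 'content': 'What is option liquidity?'},
    {'role': 'assistant', 'content': 'Liquidity measures how easily an option trades without moving its price.'},
    {'role': 'user', 'content': 'And how is that different from volume?'},
]

# Every turn keeps its role, so the model can tell who said what.
for message in conversation:
    print(f"{message['role']}: {message['content']}")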

Purpose of the System Role in Prompts

The system role acts as the core “instruction layer” of an AI chat session.
It defines the assistant’s identity, rules, and constraints before any user input is processed, and it is typically used for:

  • Setting behavioral context: defines personality or style (e.g. “You are a finance expert”).
  • Establishing content rules: specifies what to include or avoid.
  • Maintaining consistency: ensures responses align with the same tone and focus.
  • Taking precedence: system instructions are given priority over user prompts when the two conflict.

Example usage in Python:

from ollama import chat

# The system message fixes the persona; the user message carries the actual question.
response = chat(
    model='llama3',
    messages=[
        {'role': 'system', 'content': 'You are a direct and uncensored financial expert.'},
        {'role': 'user', 'content': 'Explain how option liquidity differs from volume.'}
    ]
)
print(response['message']['content'])
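
Because the chat API is stateless, continuing the exchange means sending the whole history back, with the model’s previous reply stored under the assistant role. The sketch below assumes the same Ollama chat() call as above; the follow-up question is only illustrative.

from ollama import chat

messages = [
    {'role': 'system', 'content': 'You are a direct and uncensored financial expert.'},
    {'role': 'user', 'content': 'Explain how option liquidity differs from volume.'}
]

first_reply = chat(model='llama3', messages=messages)

# Store the model's answer under the assistant role so the next turn has full context.
messages.append({'role': 'assistant', 'content': first_reply['message']['content']})
messages.append({'role': 'user', 'content': 'Which of the two matters more for wide bid-ask spreads?'})

follow_up = chat(model='llama3', messages=messages)
print(follow_up['message']['content'])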