Messages

Messages are the unit of communication in chat models: they represent the input and output of a chat model.
A message contains a role, content, and metadata.

message: {
    Role,
    Content,
    Metadata
}
Role: {
    assistant,          //response from the model, which can include text or a request to invoke tools
    function (legacy),  //deprecated; the tool role should be used instead
    system,             //tells the chat model how to behave and provides additional context
    tool,               //passes the result of a tool invocation back to the model
    user,               //input from the user: prompts, questions, and any other data sent to the model
}
Content: {
    text,                       //simple text
    multimodal content blocks,  //list of dictionaries describing images, audio, or video
}
Metadata: {     //varies by chat model provider
    ID,         //an optional unique identifier for the message
    Name,       //an optional name property
    Metadata,   //additional information about the message (e.g. timestamps, token usage)
    Tool Calls  //requests made by the model to call one or more tools
}
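The structure above can be sketched as a plain dictionary. This is a minimal illustration of the generic shape, not any specific provider's API; the field names and values are made up for the example:

```python
# A minimal sketch of a chat message as a plain dictionary.
# Field names follow the generic structure above, not a specific provider's API.
message = {
    "role": "assistant",               # assistant | system | tool | user
    "content": "The sky is blue.",     # text, or a list of multimodal content blocks
    "metadata": {
        "id": "msg_001",               # optional unique identifier
        "name": None,                  # optional name property
        "tool_calls": [],              # requests by the model to call tools
    },
}

print(message["role"])  # -> assistant
```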
        

LangChain Messages

See Messages above first.
LangChain provides a unified message format that can be used across all chat model providers.
LangChain messages are Python objects that subclass BaseMessage.
There are 5 message types:
Assistant role: AIMessage. This is the response from the model. It has the following attributes:

Attribute           Description
content             string or list (see Content in Messages)
tool_calls          tool calls associated with the message
invalid_tool_calls  tool calls with parsing errors associated with the message
usage_metadata      usage metadata for the message, such as token counts
id                  optional unique identifier
response_metadata   response metadata, e.g. response headers, logprobs, token counts

from langchain_core.messages import HumanMessage
ai_message = model.invoke([HumanMessage("Tell me a joke")])
ai_message # <-- AIMessage
                
Assistant role: AIMessageChunk (used for streaming responses). Chunks can be added together to rebuild the full message:

for chunk in model.stream([HumanMessage("What color is the sky?")]):
    print(chunk)
ai_message = chunk1 + chunk2 + chunk3 + ...
                
Tool role: ToolMessage (contains the result of calling a tool):

{
    Role,
    Content,
    tool_call_id,   //ID of the tool call this message is responding to
    artifact        //optional full output of the tool, not sent to the model
}
                
System role: SystemMessage
User role: HumanMessage

from langchain_core.messages import HumanMessage
model.invoke([HumanMessage(content="Hello, how are you?")])
                
No associated role: RemoveMessage (used to manage chat history by marking messages for removal)
Legacy: FunctionMessage (deprecated; use ToolMessage instead)