
langchain_ollama_chat_model_quickstart_with_translation.py

python

Initialize the Ollama chat model and invoke it with a simple message.

Source: python.langchain.com
from langchain_ollama import ChatOllama

# Initialize the model
llm = ChatOllama(
    model="llama3",
    temperature=0,
    # other params...
)

# Prepare the messages as (role, content) tuples
messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]

# Invoke the model
ai_msg = llm.invoke(messages)

# Print the response
print(ai_msg.content)