langchain_groq_chat_model_initialization_and_invoke.py

python

This quickstart demonstrates how to initialize a ChatGroq model and invoke it with a list of messages.

Source: python.langchain.com
from langchain_groq import ChatGroq
from langchain_core.messages import HumanMessage, SystemMessage

# Initialize the ChatGroq model
# Ensure the GROQ_API_KEY environment variable is set
chat = ChatGroq(
    model="llama-3.3-70b-versatile",
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # other_params=...
)

messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="Explain the importance of low latency LLMs."),
]

# Invoke the model
response = chat.invoke(messages)

print(response.content)
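
The snippet relies on the GROQ_API_KEY environment variable being set and will otherwise fail inside the client's request stack. A minimal sketch of a fail-fast check (require_groq_key is a hypothetical helper, not part of langchain_groq) that surfaces a clear error before the model is constructed:

```python
import os

def require_groq_key() -> str:
    """Return the Groq API key, raising a clear error if it is missing."""
    key = os.environ.get("GROQ_API_KEY")
    if not key:
        raise RuntimeError(
            "GROQ_API_KEY is not set; export it before constructing ChatGroq."
        )
    return key
```

Calling require_groq_key() just before ChatGroq(...) turns a missing credential into an immediate, readable error instead of a failure on the first invoke.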