
databricks_langchain_chat_model_quickstart_invoke.py


This quickstart demonstrates how to initialize a Databricks Chat Model endpoint and invoke it with a simple system/user message pair.

Source: python.langchain.com
import os
from langchain_databricks import ChatDatabricks
from langchain_core.messages import HumanMessage, SystemMessage

# Ensure your Databricks environment variables are set if running outside Databricks
# os.environ["DATABRICKS_HOST"] = "https://your-workspace.cloud.databricks.com"
# os.environ["DATABRICKS_TOKEN"] = "your-personal-access-token"

# Initialize the Chat Model
# Replace 'databricks-meta-llama-3-1-70b-instruct' with your specific endpoint name
chat_model = ChatDatabricks(
    endpoint="databricks-meta-llama-3-1-70b-instruct",
    temperature=0.1,
    max_tokens=256,
)

# Define messages
messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is Databricks?"),
]

# Invoke the model
response = chat_model.invoke(messages)

# Print the result
print(response.content)