
mlflow_tracing_quickstart_with_openai_autolog_and_decorator.py

python

This quickstart demonstrates how to enable automatic tracing for LLM libraries using MLflow's OpenAI autologging together with the `@mlflow.trace` decorator.

Source: mlflow.org
import mlflow
from openai import OpenAI

# Step 1: Enable auto-tracing for OpenAI
mlflow.openai.autolog()

# Step 2: Initialize the OpenAI client
client = OpenAI()

# Step 3: Use the @mlflow.trace decorator to instrument a custom function
@mlflow.trace
def chat_with_bot(user_input):
    # This call to OpenAI is automatically traced via autologging
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": user_input}],
    )
    return response.choices[0].message.content

# Step 4: Run the function
# The trace will be recorded in the active MLflow Experiment
result = chat_with_bot("What is MLflow Tracing?")
print(result)

# Step 5: View traces via the MLflow UI (run `mlflow ui` in your terminal)
# Or search traces programmatically
traces = mlflow.search_traces()
print(traces)
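Before running the snippet, the environment needs the two packages installed and an OpenAI API key available; the commands below are a minimal setup sketch (the key value is a placeholder, and `mlflow ui`'s default port of 5000 is assumed):

```shell
# Install MLflow (tracing ships in 2.14+) and the OpenAI SDK
pip install mlflow openai

# The OpenAI() client reads the API key from this environment variable
export OPENAI_API_KEY="sk-..."

# Run the snippet, then browse recorded traces at http://localhost:5000
python mlflow_tracing_quickstart_with_openai_autolog_and_decorator.py
mlflow ui
```

By default traces land in the active experiment under a local `mlruns/` directory; point `mlflow.set_tracking_uri()` at a tracking server if you want them stored centrally.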