
ollama_quickstart_chat_response_with_llama3_model.py

python

A simple script that uses the Ollama Python client to generate a chat response from the llama3 model.

ollama/ollama-python
import ollama

# Send a single user message to the locally running Ollama server
response = ollama.chat(model='llama3', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])

# The reply text lives under message.content in the response
print(response['message']['content'])
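The `messages` list is the full conversation history, so multi-turn chat works by appending each reply and re-sending the whole list. A minimal sketch of that pattern (the helper names `build_history` and `ask` are illustrative, not part of the library; `ollama.chat` itself requires a local Ollama server with the model pulled):

```python
def build_history(turns):
    """Convert (role, content) pairs into the message dicts ollama.chat expects."""
    return [{'role': role, 'content': content} for role, content in turns]

def ask(history, model='llama3'):
    """Send the accumulated history to a locally running Ollama server."""
    import ollama  # imported here so the helpers above work without the package
    response = ollama.chat(model=model, messages=history)
    return response['message']['content']

history = build_history([
    ('user', 'Why is the sky blue?'),
])
# answer = ask(history)                                   # needs `ollama serve`
# history.append({'role': 'assistant', 'content': answer})  # keep context for the next turn
```

Each subsequent user message gets appended to `history` the same way, so the model sees the whole exchange on every call.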