ollama_chat_streaming_response_with_llama3_model.ts
This quickstart demonstrates how to initialize the Ollama client and generate a streaming chat response with the llama3 model.
import ollama from 'ollama'

// Request a streamed chat completion from the locally running Ollama server.
const response = await ollama.chat({
  model: 'llama3',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
  stream: true,
})

// With stream: true, the response is an async iterable of partial messages;
// print each chunk to stdout as it arrives.
for await (const part of response) {
  process.stdout.write(part.message.content)
}
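The for-await loop works on any async iterable of chat parts, so it can be factored into a reusable helper and exercised without a running Ollama server. The sketch below uses a mock stream; `ChatPart`, `collectStream`, and `mockStream` are illustrative names and not part of the ollama library.

```typescript
// Minimal shape of each streamed part (only the field the loop reads).
interface ChatPart {
  message: { content: string }
}

// Concatenate every streamed chunk into the full reply text.
// Works with the real ollama.chat({ stream: true }) response or any mock.
async function collectStream(stream: AsyncIterable<ChatPart>): Promise<string> {
  let text = ''
  for await (const part of stream) {
    text += part.message.content
  }
  return text
}

// Mock stream standing in for a real streamed response, for local testing.
async function* mockStream(): AsyncGenerator<ChatPart> {
  for (const chunk of ['The sky ', 'is blue ', 'because...']) {
    yield { message: { content: chunk } }
  }
}

const full = await collectStream(mockStream())
console.log(full) // "The sky is blue because..."
```

Collecting the chunks this way is useful when you want the complete reply as a single string (for logging or post-processing) while still consuming the response incrementally.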