Documentation Index
Fetch the complete documentation index at: https://docs-v1.agno.com/llms.txt
Use this file to discover all available pages before exploring further.
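The index is a plain-text file listing documentation URLs, so discovering pages is just a download-and-scan step. A minimal sketch of that, assuming the index is plain text containing one URL per entry (the helper names `fetch_index` and `extract_links` are illustrative, not part of the Agno API):

```python
import re
import urllib.request

INDEX_URL = "https://docs-v1.agno.com/llms.txt"

def fetch_index(url: str = INDEX_URL) -> str:
    """Download the raw llms.txt index as plain text."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def extract_links(index_text: str) -> list[str]:
    """Pull every URL out of the index so individual pages can be explored."""
    return re.findall(r"https?://\S+", index_text)
```

Call `fetch_index()` once, then iterate over `extract_links(...)` to decide which pages to fetch next.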
Code
cookbook/models/ollama/basic_stream.py
from typing import Iterator  # noqa
from agno.agent import Agent, RunResponse  # noqa
from agno.models.ollama import Ollama

agent = Agent(model=Ollama(id="llama3.1:8b"), markdown=True)

# Get the response in a variable
# run_response: Iterator[RunResponse] = agent.run("Share a 2 sentence horror story", stream=True)
# for chunk in run_response:
#     print(chunk.content)

# Print the response in the terminal
agent.print_response("Share a 2 sentence horror story", stream=True)
Usage
Create a virtual environment
Open the terminal and create a Python virtual environment.

python3 -m venv .venv
source .venv/bin/activate
Install libraries
pip install -U ollama agno
Run Agent
python cookbook/models/ollama/basic_stream.py
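The script assumes a local Ollama server is running and the model is available. If the model has not been downloaded yet, pull it first (a one-time download):

```shell
# Fetch the llama3.1:8b model so the agent can use it locally
ollama pull llama3.1:8b
```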