Setup and Configuration

Set the INFERENCE_KEY environment variable to your API key and INFERENCE_URL to the base URL:
export INFERENCE_KEY='your-heroku-api-key'
export INFERENCE_URL='https://us.inference.heroku.com'
Then, configure the Pydantic AI agent:
import os

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.heroku import HerokuProvider

# Read the key set in the previous step instead of hardcoding it.
model = OpenAIModel(
    'claude-4-sonnet',
    provider=HerokuProvider(api_key=os.environ['INFERENCE_KEY']),
)
agent = Agent(model)

MCP

You can combine Heroku Managed Inference and Agents with Pydantic AI to build agentic workflows that use built-in tools and tool calling via MCP (Model Context Protocol).
  • MCP Client: Pydantic AI agents can act as an MCP Client, connecting to MCP servers to use their tools.
  • MCP Server: Agents can be exposed as MCP servers, allowing other agents to use them as tools.

A2A

Pydantic’s FastA2A library simplifies implementing the A2A (Agent2Agent) protocol in Python. To expose a Pydantic AI agent as an A2A server:
from pydantic_ai import Agent
agent = Agent('heroku:claude-4-sonnet', instructions='Be fun!')
app = agent.to_a2a()
Then, save the example as agent_to_a2a.py and run it with uvicorn:
uvicorn agent_to_a2a:app --host 0.0.0.0 --port 8000

Additional Resources