Strands Agents

A model-driven approach to building AI agents in just a few lines of code.


Documentation | Samples | Python SDK | Tools | Agent Builder | MCP Server

Strands Agents is a simple yet powerful SDK that takes a model-driven approach to building and running AI agents. From simple conversational assistants to complex autonomous workflows, from local development to production deployment, Strands Agents scales with your needs.

Feature Overview

  • Lightweight & Flexible: Simple agent loop that just works and is fully customizable
  • Model Agnostic: Support for Amazon Bedrock, Anthropic, Gemini, LiteLLM, Llama, Ollama, OpenAI, Writer, and custom providers
  • Advanced Capabilities: Multi-agent systems, autonomous agents, and streaming support
  • Built-in MCP: Native support for Model Context Protocol (MCP) servers, enabling access to thousands of pre-built tools

Quick Start

```bash
# Install Strands Agents
pip install strands-agents strands-agents-tools
```

```python
from strands import Agent
from strands_tools import calculator

agent = Agent(tools=[calculator])
agent("What is the square root of 1764")
```

Note: For the default Amazon Bedrock model provider, you'll need AWS credentials configured and model access enabled for Claude 4 Sonnet in the us-west-2 region. See the Quickstart Guide for details on configuring other model providers.
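One common way to supply those AWS credentials is through the standard AWS SDK environment variables (shown here with placeholder values; an AWS profile or `aws configure` works equally well):

```shell
# Placeholder values -- substitute your own credentials
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export AWS_REGION="us-west-2"  # region where Claude model access is enabled
```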

Installation

Ensure you have Python 3.10+ installed, then:

```bash
# Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows use: .venv\Scripts\activate

# Install Strands and tools
pip install strands-agents strands-agents-tools
```

Features at a Glance

Python-Based Tools

Easily build tools using Python decorators:

```python
from strands import Agent, tool

@tool
def word_count(text: str) -> int:
    """Count words in text.

    This docstring is used by the LLM to understand the tool's purpose.
    """
    return len(text.split())

agent = Agent(tools=[word_count])
response = agent("How many words are in this sentence?")
```

Hot Reloading from Directory: Enable automatic tool loading and reloading from the ./tools/ directory:

```python
from strands import Agent

# Agent will watch ./tools/ directory for changes
agent = Agent(load_tools_from_directory=True)
response = agent("Use any tools you find in the tools directory")
```

MCP Support

Seamlessly integrate Model Context Protocol (MCP) servers:

```python
from strands import Agent
from strands.tools.mcp import MCPClient
from mcp import stdio_client, StdioServerParameters

aws_docs_client = MCPClient(
    lambda: stdio_client(
        StdioServerParameters(command="uvx", args=["awslabs.aws-documentation-mcp-server@latest"])
    )
)

with aws_docs_client:
    agent = Agent(tools=aws_docs_client.list_tools_sync())
    response = agent("Tell me about Amazon Bedrock and how to use it with Python")
```

Multiple Model Providers

Support for various model providers:

```python
from strands import Agent
from strands.models import BedrockModel
from strands.models.ollama import OllamaModel
from strands.models.llamaapi import LlamaAPIModel
from strands.models.gemini import GeminiModel
from strands.models.llamacpp import LlamaCppModel

# Bedrock
bedrock_model = BedrockModel(
    model_id="us.amazon.nova-pro-v1:0",
    temperature=0.3,
    streaming=True,  # Enable/disable streaming
)
agent = Agent(model=bedrock_model)
agent("Tell me about Agentic AI")

# Google Gemini
gemini_model = GeminiModel(
    client_args={"api_key": "your_gemini_api_key"},
    model_id="gemini-2.5-flash",
    params={"temperature": 0.7},
)
agent = Agent(model=gemini_model)
agent("Tell me about Agentic AI")

# Ollama
ollama_model = OllamaModel(
    host="http://localhost:11434",
    model_id="llama3",
)
agent = Agent(model=ollama_model)
agent("Tell me about Agentic AI")

# Llama API
llama_model = LlamaAPIModel(
    model_id="Llama-4-Maverick-17B-128E-Instruct-FP8",
)
agent = Agent(model=llama_model)
response = agent("Tell me about Agentic AI")
```

Built-in providers:

  • Amazon Bedrock
  • Anthropic
  • Gemini
  • LiteLLM
  • Llama
  • Ollama
  • OpenAI
  • Writer

Custom providers can be implemented by following the Custom Providers guide.

Example tools

Strands offers an optional strands-agents-tools package with pre-built tools for quick experimentation:

```python
from strands import Agent
from strands_tools import calculator

agent = Agent(tools=[calculator])
agent("What is the square root of 1764")
```

It's also available on GitHub via strands-agents/tools.

Documentation

For detailed guidance & examples, explore our documentation.

Contributing ❤️

We welcome contributions! See our Contributing Guide for details on:

  • Reporting bugs & requesting features
  • Development setup
  • Contributing via Pull Requests
  • Code of Conduct
  • Reporting security issues

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

Security

See CONTRIBUTING for more information.
