Releases: openai/openai-agents-python
v0.1.0
Please note that this version includes a small breaking change to MCP servers, hence the minor version bump. See https://openai.github.io/openai-agents-python/release/ for the breaking change changelog.
What's Changed
- Add is_enabled to handoffs by @rm-openai in #925
- Removed lines to avoid duplicate output in REPL utility by @DanieleMorotti in #928
- Bugfix | Fixed a bug when calling reasoning models with `store=False` by @niv-hertz in #920
- Point CLAUDE.md to AGENTS.md by @rm-openai in #930
- Fix #890 by adjusting the guardrail document page by @seratch in #903
- Add exempt-issue-labels to the issue triage GH Action job by @seratch in #936
- Remove duplicate entry from `__init__.py` by @Lightblues in #897
- Fix #604 Chat Completion model raises runtime error when response.choices is empty by @seratch in #935
- Fix #892 by adding proper tool description in context.md by @Abbas-Asad in #937
- Duplicate Classifier in pyproject.toml by @DanielHashmi in #951
- fix: add ensure_ascii=False to json.dumps for correct Unicode output by @KatHaruto in #639
- Add uv as an alternative Python environment setup option for issue #884 by @Sucran in #909
- Fix and Document `parallel_tool_calls` Attribute in ModelSettings by @Rehan-Ul-Haq in #763
- replace .py file with .ipynb for Jupyter example by @ccmien in #262
- feat: add MCP tool filtering support by @devtalker in #861
- Tweak in pyproject.toml by @seratch in #952
- Fix incorrect argument description in `on_trace_end` docstring by @shirazkk in #958
- Fix #944 by updating the models document page to cover extra_args by @seratch in #950
- Annotating the openai.Omit type so that ModelSettings can be serialized by pydantic by @tconley1428 in #938
- Import Path Inconsistency by @DanielHashmi in #960
- Add reasoning content - fix on #494 by @axion66 in #871
- Add safety check handling for ComputerTool by @rm-openai in #923
- v0.1.0 by @rm-openai in #963
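The Unicode fix above (#639) comes down to `json.dumps` escaping non-ASCII characters to `\uXXXX` sequences by default; a minimal stdlib illustration of what `ensure_ascii=False` changes:

```python
import json

payload = {"message": "こんにちは"}

# Default: non-ASCII characters are escaped to \uXXXX sequences.
print(json.dumps(payload))                      # {"message": "\u3053\u3093\u306b\u3061\u306f"}

# With ensure_ascii=False, the characters pass through unescaped.
print(json.dumps(payload, ensure_ascii=False))  # {"message": "こんにちは"}
```

Both outputs are valid JSON; the difference only matters when the serialized text is shown to humans or diffed.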
New Contributors
- @Lightblues made their first contribution in #897
- @Abbas-Asad made their first contribution in #937
- @DanielHashmi made their first contribution in #951
- @KatHaruto made their first contribution in #639
- @Sucran made their first contribution in #909
- @devtalker made their first contribution in #861
- @shirazkk made their first contribution in #958
- @tconley1428 made their first contribution in #938
- @axion66 made their first contribution in #871
Full Changelog: v0.0.19...v0.1.0
v0.0.19
What's Changed
- Make Runner an abstract base class by @pakrym-oai in #720
- Prepare 0.0.19 release by @pakrym-oai in #895
Full Changelog: v0.0.18...v0.0.19
v0.0.18
Key changes
- Added support for dynamic prompt templates via the OpenAI Prompts feature
- REPL support
- Bug fixes
What's Changed
- Add REPL run_demo_loop helper by @rm-openai in #811
- Crosslink to js/ts by @rm-openai in #815
- Add release documentation by @rm-openai in #814
- Fix handoff transfer message JSON by @jhills20 in #818
- docs: custom output extraction by @jleguina in #817
- Added support for passing tool_call_id via the RunContextWrapper by @niv-hertz in #766
- Allow arbitrary kwargs in model by @rm-openai in #842
- Fix function_schema name override bug by @rm-openai in #872
- adopted float instead of timedelta for timeout parameters by @DanieleMorotti in #874
- Prompts support by @rm-openai in #876
- v0.0.18 by @rm-openai in #878
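The timeout change above (#874) replaced `timedelta` with plain float seconds; a sketch of how a caller migrating from the old style can convert an existing `timedelta` value:

```python
from datetime import timedelta

# Previously a timedelta could be passed; timeout parameters now take
# float seconds (per #874), so convert with total_seconds().
old_timeout = timedelta(minutes=1, seconds=30)
timeout_seconds = old_timeout.total_seconds()
print(timeout_seconds)  # 90.0
```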
New Contributors
- @jleguina made their first contribution in #817
- @niv-hertz made their first contribution in #766
Full Changelog: v0.0.17...v0.0.18
v0.0.17
What's Changed
- fix Gemini token validation issue with LiteLLM by @handrew in #735
- Fix visualization recursion with cycle detection by @rm-openai in #737
- Update MCP and tool docs by @rm-openai in #736
- Fix Gemini API content filter handling by @rm-openai in #746
- Add Portkey AI as a tracing provider by @siddharthsambharia-portkey in #785
- Added RunErrorDetails object for MaxTurnsExceeded exception by @DanieleMorotti in #743
- Fixed Python syntax by @sarmadgulzar in #665
- Small fix for litellm model by @robtinn in #789
- Fix typo in assertion message for handoff function by @Rehan-Ul-Haq in #780
- Fix typo: Replace 'two' with 'three' in /docs/mcp.md by @luochang212 in #757
- Update input_guardrails.py by @venkatnaveen7 in #774
- docs: fix typo in docstring for is_strict_json_schema method by @Rehan-Ul-Haq in #775
- Add comment to handoff_occured misspelling by @rm-openai in #792
- Fix #777 by handling MCPCall events in RunImpl by @seratch in #799
- Ensure item.model_dump only contains JSON serializable types by @westhood in #801
- Don't cache agent tools during a run by @rm-openai in #803
- Only start tracing worker thread on first span/trace by @rm-openai in #804
- Add is_enabled to FunctionTool by @rm-openai in #808
- v0.0.17 by @rm-openai in #809
New Contributors
- @siddharthsambharia-portkey made their first contribution in #785
- @sarmadgulzar made their first contribution in #665
- @robtinn made their first contribution in #789
- @Rehan-Ul-Haq made their first contribution in #780
- @luochang212 made their first contribution in #757
- @venkatnaveen7 made their first contribution in #774
- @westhood made their first contribution in #801
Full Changelog: v0.0.16...v0.0.17
v0.0.16
The biggest change is support for new hosted tools: remote MCP, code interpreter, image generator, and local shell.
What's Changed
- Create AGENTS.md by @dkundel-openai in #707
- Added mcp 'instructions' attribute to the server by @DanieleMorotti in #706
- Add Galileo to external tracing processors list by @franz101 in #662
- DRAFT: Dev/add usage details to Usage class by @WJPBProjects in #726
- Upgrade openAI sdk version by @rm-openai in #730
- Hosted MCP support by @rm-openai in #731
- Add support for local shell, image generator, code interpreter tools by @rm-openai in #732
- v0.0.16 by @rm-openai in #733
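For the remote ("hosted") MCP support above (#731), the tool is described by a config dict in the shape of the Responses API `mcp` tool type. The sketch below is an assumption-labeled illustration of that shape (the label and URL are hypothetical placeholders); check the current API reference for the authoritative field list:

```python
# Hypothetical sketch of a hosted MCP tool config, following the
# Responses API "mcp" tool shape; values here are placeholders.
tool_config = {
    "type": "mcp",
    "server_label": "docs_server",            # label echoed in tool-call items
    "server_url": "https://example.com/mcp",  # remote MCP server endpoint
    "require_approval": "never",              # or "always"
}
```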
New Contributors
- @franz101 made their first contribution in #662
- @WJPBProjects made their first contribution in #726
Full Changelog: v0.0.15...v0.0.16
v0.0.15
The main change is support for Streamable HTTP in MCP servers.
What's Changed
- Fixed a bug for "detail" attribute in input image by @DanieleMorotti in #685
- feat: pass extra_body through to LiteLLM acompletion by @AshokSaravanan222 in #638
- Update search_agent.py by @leohpark in #677
- feat: Streamable HTTP support by @Akshit97 in #643
- v0.0.15 by @rm-openai in #701
New Contributors
- @AshokSaravanan222 made their first contribution in #638
- @leohpark made their first contribution in #677
- @Akshit97 made their first contribution in #643
Full Changelog: v0.0.14...v0.0.15
v0.0.14
What's Changed
- Add usage to context in streaming by @rm-openai in #595
- Make the TTS voices type exportable by @mangiucugna in #577
- docs: add FutureAGI to tracing documentation by @NVJKKartik in #592
- Update litellm version by @pakrym-oai in #626
- 0.0.14 release by @pakrym-oai in #635
New Contributors
- @mangiucugna made their first contribution in #577
- @NVJKKartik made their first contribution in #592
Full Changelog: v0.0.13...v0.0.14
v0.0.13
What's Changed
- Adding extra_headers parameters to ModelSettings by @jonnyk20 in #550
- Examples: Fix financial_research_agent instructions by @seratch in #573
- Allow cancel out of the streaming result by @handrew in #579
- Create to_json_dict for ModelSettings by @rm-openai in #582
- Prevent MCP ClientSession hang by @njbrake in #580
- Fix stream error using LiteLLM by @DanieleMorotti in #589
- More tests for cancelling streamed run by @rm-openai in #590
- v0.0.13 by @rm-openai in #593
New Contributors
- @jonnyk20 made their first contribution in #550
- @handrew made their first contribution in #579
- @njbrake made their first contribution in #580
Full Changelog: v0.0.12...v0.0.13
v0.0.12
Key changes:
- Use any model via litellm: `Agent(model="litellm/<provider>/<model_name>")`, e.g. `Agent(model="litellm/anthropic/claude-3-5-sonnet-20240620")`
- Enable more complex output types on Agents
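The model string above follows a `litellm/<provider>/<model_name>` convention, where the `litellm/` prefix routes the request through LiteLLM. This sketch only demonstrates the naming convention itself, not the SDK's internal resolution logic:

```python
# Break a litellm-prefixed model string into its three parts.
model = "litellm/anthropic/claude-3-5-sonnet-20240620"
prefix, provider, model_name = model.split("/", 2)
print(prefix, provider, model_name)  # litellm anthropic claude-3-5-sonnet-20240620
```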
What's Changed
- Run CI on all commits, not just ones on main by @rm-openai in #521
- Extract chat completions conversion code into helper by @rm-openai in #522
- Extract chat completions streaming helpers by @rm-openai in #523
- Show repo name/data in docs by @rm-openai in #525
- Litellm integration by @rm-openai in #524
- Docs for LiteLLM integration by @rm-openai in #532
- Docs: Switch to o3 model; exclude translated pages from search by @seratch in #533
- Examples for image inputs by @rm-openai in #553
- Enable non-strict output types by @rm-openai in #539
- Start and finish streaming trace in impl method by @rm-openai in #540
- Fix visualize graph filename to without extension. by @yuya-haruna in #554
- RFC: automatically use litellm if possible by @rm-openai in #534
- Docs and tests for litellm by @rm-openai in #561
- Pass through organization/project headers to tracing backend, fix speech_group enum by @stevenheidel in #562
- v0.0.12 by @rm-openai in #564
New Contributors
- @yuya-haruna made their first contribution in #554
Full Changelog: v0.0.11...v0.0.12
v0.0.11
What's Changed
- Examples and tests for previous_response_id by @rm-openai in #512
- Only include stream_options when streaming by @rm-openai in #519
- v0.0.11 by @rm-openai in #520
Full Changelog: v0.0.10...v0.0.11