Comet LLM releases¶
The following is a history of released comet_llm versions. It does not list every change in a release, but it does cover the highlights and all public-facing additions, changes, and deprecations.
You can install any of the released versions listed below. The full list of releases is available on the Python Package Index.
Release 1.6.0¶
Release date: Nov 9, 2023
- Added support for openai v1
Release 1.5.0¶
Release date: Nov 6, 2023
- Introduced new logic and a config variable (`COMET_RAISE_EXCEPTIONS_OR_ERROR`) for handling errors related to the API key.
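As a minimal sketch, the variable can be set in the environment before `comet_llm` is imported. That it is read from the environment at import time, and that `"false"` is an accepted value, are assumptions here; check the configuration reference for the exact semantics.

```python
import os

# Assumption: COMET_RAISE_EXCEPTIONS_OR_ERROR is read from the environment
# when comet_llm is imported, and "false" downgrades API-key exceptions
# to logged errors instead of raising.
os.environ["COMET_RAISE_EXCEPTIONS_OR_ERROR"] = "false"

# import comet_llm  # import only after the variable is set
```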
Release 1.4.1¶
Release date: Oct 6, 2023
- Fixed a packaging issue that made version 1.4.0 impossible to import
Release 1.4.0¶
Release date: Oct 6, 2023
Note: this version has a packaging issue and can't be imported; please use version 1.4.1 instead.
- Added OpenAI auto-logger
- Added debug logging mode
- An error is now logged when attempting to log to a non-LLM project
Release 1.3.0¶
Release date: Sep 7, 2023
- Introduced `comet_llm.is_ready()`, which signals whether everything is ready to start logging data.
- Introduced a disabled mode for the `comet-llm` API. In this mode, all calls to the prompt- and chain-logging API do nothing.
Release 1.2.0¶
Release date: Sep 1, 2023
- Added multi-thread support for chains. Every thread now has its own "global" chain: `start_chain`, `end_chain`, and `Span` calls refer to different chains when called from different threads. This makes it possible to execute multiple chains in parallel out of the box (one per thread) without chain-consistency problems.
- Added an examples folder with Jupyter notebook and readme files.
Dec. 19, 2023