Feat: IBM watsonx.ai llm and embeddings integration #13600
base: main
Conversation
@Mateusz-Switala Looks good. Let's switch to review and start the thread with the community.
description = "llama-index llms IBM watsonx.ai integration"
exclude = ["**/BUILD"]
license = "MIT"
name = "llama-index-llms-ibm"
We already have a Watson integration in llama-index-integrations/llms/llama-index-llms-watsonx -- is this different?
@logan-markewich The changes contain both the LLM and the embeddings integration. There is currently an LLM integration in LlamaIndex close to the proposed one: WatsonX (https://docs.llamaindex.ai/en/stable/examples/llm/watsonx/). However, since that integration uses the old IBM Python client SDK (ibm-watson-machine-learning), does not cover the full functionality of the watsonx.ai service (e.g. deployment inferencing), and is not consistent with IBM watsonx.ai's other open-source integrations (e.g. LangChain), we decided to introduce a new package: llama-index-llms-ibm. Making the proposed changes in the llama-index-llms-watsonx package instead of creating a new integration package could cause breaking changes. Simultaneously, we add deprecation warnings to the current WatsonX LLM integration and update the documentation accordingly.
Please note that this would be the official IBM watsonx.ai integration. Also, even if the PR is reviewed positively soon, please DO NOT merge it yet.
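The deprecation path described above can be sketched as follows. This is a hypothetical illustration (the class body and message wording are assumptions, not the PR's actual code) of how the legacy WatsonX class could warn users to migrate to the new package:

```python
import warnings


class WatsonX:
    """Hypothetical sketch of the legacy WatsonX LLM class with a
    deprecation notice pointing at the new llama-index-llms-ibm package."""

    def __init__(self) -> None:
        # Emit the warning at construction time so existing user code
        # keeps working but surfaces the migration hint.
        warnings.warn(
            "WatsonX is deprecated; use WatsonxLLM from the "
            "llama-index-llms-ibm package instead.",
            DeprecationWarning,
            stacklevel=2,
        )
```

A `DeprecationWarning` (rather than a hard error) keeps the old integration usable while the documentation is updated.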
...index-integrations/embeddings/llama-index-embeddings-ibm/llama_index/embeddings/ibm/utils.py
llama-index-integrations/llms/llama-index-llms-ibm/llama_index/llms/ibm/utils.py
@logan-markewich Changes are ready for official review.
parsed_url = urllib.parse.urlparse(creds["url"].get_secret_value())
if parsed_url.netloc.endswith("cloud.ibm.com"):

Check failure: Code scanning / CodeQL — Incomplete URL substring sanitization (High): cloud.ibm.com
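The CodeQL alert above fires because a bare `endswith("cloud.ibm.com")` also accepts hosts such as `evilcloud.ibm.com`. A common remediation is to compare the host exactly or require a `.`-separated subdomain. The helper below is a hypothetical sketch of that pattern (the function name is an assumption, not code from this PR):

```python
import urllib.parse


def is_cloud_ibm_host(url: str) -> bool:
    """Return True only if the URL's host is cloud.ibm.com itself or a
    proper subdomain of it (e.g. us-south.ml.cloud.ibm.com)."""
    # urlparse(...).hostname is lowercased and has any port stripped.
    host = urllib.parse.urlparse(url).hostname or ""
    # Exact match, or a suffix preceded by a dot, so "evilcloud.ibm.com"
    # and "cloud.ibm.com.attacker.example" are both rejected.
    return host == "cloud.ibm.com" or host.endswith(".cloud.ibm.com")
```

Checking the parsed hostname rather than the raw URL string also avoids matching the domain when it appears only in the path or query.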
Description
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Summary:
The aim of this PR is to introduce the official IBM watsonx.ai integration. The changes contain an LLM class WatsonxLLM that extends llama_index.core.llms.custom.CustomLLM, and an embeddings class WatsonxEmbeddings that extends llama_index.core.base.embeddings.base.BaseEmbedding. Under the hood, the integration uses components of the ibm-watsonx-ai package and is aligned with recent changes in the ibm-watsonx-ai SDK.

There is currently an LLM integration in LlamaIndex close to the proposed one: WatsonX (https://docs.llamaindex.ai/en/stable/examples/llm/watsonx/). However, since that integration uses the old IBM Python client SDK (ibm-watson-machine-learning), does not cover the full functionality of the watsonx.ai service (e.g. deployment inferencing), and is not consistent with IBM watsonx.ai's other open-source integrations (e.g. LangChain), we decided to introduce a new package: llama-index-llms-ibm. Simultaneously, we add deprecation warnings to the current WatsonX LLM integration and update the documentation accordingly.

fyi @LukaszCmielowski @acasaletto
New Package?
Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?

Version Bump?
Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)

Type of Change
Please delete options that are not relevant.
How Has This Been Tested?
Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.
Suggested Checklist:
I ran make format; make lint to appease the lint gods