17 | 17 | "source": [
18 | 18 | "# ChatOCIGenAI\n",
19 | 19 | "\n",
20 |    | - "This notebook provides a quick overview for getting started with OCIGenAI [chat models](/docs/concepts/chat_models). For detailed documentation of all ChatOCIGenAI features and configurations head to the [API reference](https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.oci_generative_ai.ChatOCIGenAI.html).\n",
   | 20 | + "This notebook provides a quick overview for getting started with OCIGenAI [chat models](/docs/concepts/chat_models). For detailed documentation of all ChatOCIGenAI features and configurations head to the [API reference](https://pypi.org/project/langchain-oci/).\n",
21 | 21 | "\n",
22 | 22 | "Oracle Cloud Infrastructure (OCI) Generative AI is a fully managed service that provides a set of state-of-the-art, customizable large language models (LLMs) that cover a wide range of use cases, and which is available through a single API.\n",
23 | 23 | "Using the OCI Generative AI service you can access ready-to-use pretrained models, or create and host your own fine-tuned custom models based on your own data on dedicated AI clusters. Detailed documentation of the service and API is available __[here](https://docs.oracle.com/en-us/iaas/Content/generative-ai/home.htm)__ and __[here](https://docs.oracle.com/en-us/iaas/api/#/en/generative-ai/20231130/)__.\n",
26 | 26 | "## Overview\n",
27 | 27 | "### Integration details\n",
28 | 28 | "\n",
29 |    | - "| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/docs/integrations/chat/oci_generative_ai) |\n",
30 |    | - "| :--- | :--- | :---: | :---: | :---: |\n",
31 |    | - "| [ChatOCIGenAI](https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.oci_generative_ai.ChatOCIGenAI.html) | [langchain-community](https://python.langchain.com/api_reference/community/index.html) | ❌ | ❌ | ❌ |\n",
   | 29 | + "| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/docs/integrations/chat/oci_generative_ai) |\n",
   | 30 | + "| :--- |:---------------------------------------------------------------------------------| :---: | :---: | :---: |\n",
   | 31 | + "| [ChatOCIGenAI](https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.oci_generative_ai.ChatOCIGenAI.html) | [langchain-oci](https://github.com/oracle/langchain-oracle) | ❌ | ❌ | ❌ |\n",
32 | 32 | "\n",
33 | 33 | "### Model features\n",
34 | 34 | "| [Tool calling](/docs/how_to/tool_calling/) | [Structured output](/docs/how_to/structured_output/) | [JSON mode](/docs/how_to/structured_output/#advanced-specifying-the-method-for-structuring-outputs) | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
37 | 37 | "\n",
38 | 38 | "## Setup\n",
39 | 39 | "\n",
40 |    | - "To access OCIGenAI models you'll need to install the `oci` and `langchain-community` packages.\n",
   | 40 | + "To access OCIGenAI models you'll need to install the `oci` and `langchain-oci` packages.\n",
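The install step named in this hunk can be run as a one-liner. A sketch assuming `pip` is available; the package names `oci` and `langchain-oci` come from the updated text:

```shell
# Install the OCI SDK and the LangChain OCI integration package
pip install -U oci langchain-oci
```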
41 | 41 | "\n",
42 | 42 | "### Credentials\n",
43 | 43 | "\n",
84 | 84 | "outputs": [],
85 | 85 | "source": [
86 | 86 | "from langchain_oci.chat_models import ChatOCIGenAI\n",
87 |    | - "from langchain_core.messages import AIMessage, HumanMessage, SystemMessage\n",
88 | 87 | "\n",
89 | 88 | "chat = ChatOCIGenAI(\n",
90 |    | - "    model_id=\"cohere.command-r-16k\",\n",
   | 89 | + "    model_id=\"cohere.command-r-plus-08-2024\",\n",
91 | 90 | "    service_endpoint=\"https://inference.generativeai.us-chicago-1.oci.oraclecloud.com\",\n",
92 |    | - "    compartment_id=\"MY_OCID\",\n",
93 |    | - "    model_kwargs={\"temperature\": 0.7, \"max_tokens\": 500},\n",
   | 91 | + "    compartment_id=\"compartment_id\",\n",
   | 92 | + "    model_kwargs={\"temperature\": 0, \"max_tokens\": 500},\n",
   | 93 | + "    auth_type=\"SECURITY_TOKEN\",\n",
   | 94 | + "    auth_profile=\"auth_profile_name\",\n",
   | 95 | + "    auth_file_location=\"auth_file_location\",\n",
94 | 96 | ")"
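The constructor arguments added in this hunk can be summarized in a stdlib-only sketch. The values are the same placeholders as in the diff, no OCI connection is made, and the inline comments are interpretations rather than langchain-oci documentation:

```python
# Placeholder configuration mirroring the ChatOCIGenAI(...) call above.
config = {
    "model_id": "cohere.command-r-plus-08-2024",  # on-demand OCI GenAI model
    "service_endpoint": "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    "compartment_id": "compartment_id",  # placeholder: your compartment OCID
    "model_kwargs": {"temperature": 0, "max_tokens": 500},  # sampling controls
    "auth_type": "SECURITY_TOKEN",  # session-token auth instead of API keys
    "auth_profile": "auth_profile_name",  # placeholder: profile in the OCI config file
    "auth_file_location": "auth_file_location",  # placeholder: path to that config file
}
print(len(config))  # → 7 keyword arguments, matching the diff
```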
95 | 97 | ]
96 | 98 | },

110 | 112 | "tags": []
111 | 113 | },
112 | 114 | "outputs": [],
113 |     | - "source": [
114 |     | - "messages = [\n",
115 |     | - "    SystemMessage(content=\"your are an AI assistant.\"),\n",
116 |     | - "    AIMessage(content=\"Hi there human!\"),\n",
117 |     | - "    HumanMessage(content=\"tell me a joke.\"),\n",
118 |     | - "]\n",
119 |     | - "response = chat.invoke(messages)"
120 |     | - ]
    | 115 | + "source": "response = chat.invoke(\"Tell me one fact about Earth\")"
121 | 116 | },
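Outside a configured OCI tenancy, the new single-string `invoke` pattern can be exercised against a stub. `FakeChat` below is a hypothetical stand-in for illustration only, not part of langchain-oci; the real `ChatOCIGenAI.invoke` call returns a message object whose text lives in `.content`:

```python
from types import SimpleNamespace

class FakeChat:
    """Hypothetical stand-in for ChatOCIGenAI, used only for illustration."""

    def invoke(self, prompt: str):
        # Return an object with a .content attribute, mimicking a chat-model
        # response without calling any service.
        return SimpleNamespace(content=f"stub answer to: {prompt}")

chat = FakeChat()
response = chat.invoke("Tell me one fact about Earth")
print(response.content)  # → stub answer to: Tell me one fact about Earth
```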
122 | 117 | {
123 | 118 | "cell_type": "code",

146 | 141 | "metadata": {},
147 | 142 | "outputs": [],
148 | 143 | "source": [
149 |     | - "from langchain_core.prompts import ChatPromptTemplate\n",
150 |     | - "\n",
151 |     | - "prompt = ChatPromptTemplate.from_template(\"Tell me a joke about {topic}\")\n",
152 |     | - "chain = prompt | chat\n",
    | 144 | + "from langchain_core.prompts import PromptTemplate\n",
    | 145 | + "from langchain_oci.chat_models import ChatOCIGenAI\n",
153 | 146 | "\n",
154 |     | - "response = chain.invoke({\"topic\": \"dogs\"})\n",
155 |     | - "print(response.content)"
    | 147 | + "llm = ChatOCIGenAI(\n",
    | 148 | + "    model_id=\"cohere.command-r-plus-08-2024\",\n",
    | 149 | + "    service_endpoint=\"https://inference.generativeai.us-chicago-1.oci.oraclecloud.com\",\n",
    | 150 | + "    compartment_id=\"compartment_id\",\n",
    | 151 | + "    model_kwargs={\"temperature\": 0, \"max_tokens\": 500},\n",
    | 152 | + "    auth_type=\"SECURITY_TOKEN\",\n",
    | 153 | + "    auth_profile=\"auth_profile_name\",\n",
    | 154 | + "    auth_file_location=\"auth_file_location\",\n",
    | 155 | + ")\n",
    | 156 | + "prompt = PromptTemplate(input_variables=[\"query\"], template=\"{query}\")\n",
    | 157 | + "llm_chain = prompt | llm\n",
    | 158 | + "response = llm_chain.invoke(\"what is the capital of france?\")\n",
    | 159 | + "print(response)"
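The `prompt | llm` expression in the added cell composes two runnables into a chain. As a rough stdlib-only analogy, with hypothetical classes rather than the LangChain implementation, the pipe reads as "format the input, then call the model":

```python
class Step:
    """Hypothetical runnable: wraps a function and supports | composition."""

    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # Chain self into other: the output of self feeds the input of other.
        return Step(lambda value: other.fn(self.fn(value)))

    def invoke(self, value):
        return self.fn(value)

# Mirrors PromptTemplate(input_variables=["query"], template="{query}")
prompt = Step(lambda query: f"{query}")
# Stand-in for the model call; the real chain would hit OCI GenAI here.
llm = Step(lambda text: f"echo: {text}")

llm_chain = prompt | llm
print(llm_chain.invoke("what is the capital of france?"))  # → echo: what is the capital of france?
```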
156 | 160 | ]
157 | 161 | },
158 | 162 | {

162 | 166 | "source": [
163 | 167 | "## API reference\n",
164 | 168 | "\n",
165 |     | - "For detailed documentation of all ChatOCIGenAI features and configurations head to the API reference: https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.oci_generative_ai.ChatOCIGenAI.html"
    | 169 | + "For detailed documentation of all ChatOCIGenAI features and configurations head to the API reference: https://pypi.org/project/langchain-oci/"
166 | 170 | ]
167 | 171 | }
168 | 172 | ],