partners: add xAI chat integration (#28032)
Showing 26 changed files with 3,378 additions and 0 deletions.
@@ -0,0 +1,332 @@
{
"cells": [
{
"cell_type": "raw",
"id": "afaf8039",
"metadata": {},
"source": [
"---\n",
"sidebar_label: xAI\n",
"---"
]
},
{
"cell_type": "markdown",
"id": "e49f1e0d",
"metadata": {},
"source": [
"# ChatXAI\n",
"\n",
"\n",
"This page will help you get started with xAI [chat models](../../concepts/chat_models.mdx). For detailed documentation of all `ChatXAI` features and configurations head to the [API reference](https://python.langchain.com/api_reference/xai/chat_models/langchain_xai.chat_models.ChatXAI.html).\n", | ||
"\n", | ||
"[xAI](https://console.x.ai/) offers an API to interact with Grok models.\n", | ||
"\n", | ||
"## Overview\n", | ||
"### Integration details\n", | ||
"\n", | ||
"| Class | Package | Local | Serializable | [JS support](https://js.langchain.com/docs/integrations/chat/xai) | Package downloads | Package latest |\n", | ||
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n", | ||
"| [ChatXAI](https://python.langchain.com/api_reference/xai/chat_models/langchain_xai.chat_models.ChatXAI.html) | [langchain-xai](https://python.langchain.com/api_reference/xai/index.html) | ❌ | beta | ✅ | ![PyPI - Downloads](https://img.shields.io/pypi/dm/langchain-xai?style=flat-square&label=%20) | ![PyPI - Version](https://img.shields.io/pypi/v/langchain-xai?style=flat-square&label=%20) |\n", | ||
"\n", | ||
"### Model features\n", | ||
"| [Tool calling](../../how_to/tool_calling.ipynb) | [Structured output](../../how_to/structured_output.ipynb) | JSON mode | [Image input](../../how_to/multimodal_inputs.ipynb) | Audio input | Video input | [Token-level streaming](../../how_to/chat_streaming.ipynb) | Native async | [Token usage](../../how_to/chat_token_usage_tracking.ipynb) | [Logprobs](../../how_to/logprobs.ipynb) |\n", | ||
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n", | ||
"| ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | ✅ | \n", | ||
"\n", | ||
"## Setup\n", | ||
"\n", | ||
"To access xAI models you'll need to create an xAI account, get an API key, and install the `langchain-xai` integration package.\n", | ||
"\n", | ||
"### Credentials\n", | ||
"\n", | ||
"Head to [this page](https://console.x.ai/) to sign up for xAI and generate an API key. Once you've done this set the `XAI_API_KEY` environment variable:" | ||
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "433e8d2b-9519-4b49-b2c4-7ab65b046c94",
"metadata": {},
"outputs": [],
"source": [
"import getpass\n",
"import os\n",
"\n",
"if \"XAI_API_KEY\" not in os.environ:\n",
"    os.environ[\"XAI_API_KEY\"] = getpass.getpass(\"Enter your xAI API key: \")"
]
},
{
"cell_type": "markdown",
"id": "72ee0c4b-9764-423a-9dbf-95129e185210",
"metadata": {},
"source": [
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:" | ||
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "a15d341e-3e26-4ca3-830b-5aab30ed66de",
"metadata": {},
"outputs": [],
"source": [
"# os.environ[\"LANGSMITH_API_KEY\"] = getpass.getpass(\"Enter your LangSmith API key: \")\n",
"# os.environ[\"LANGSMITH_TRACING\"] = \"true\""
]
},
{
"cell_type": "markdown",
"id": "0730d6a1-c893-4840-9817-5e5251676d5d",
"metadata": {},
"source": [
"### Installation\n",
"\n",
"The LangChain xAI integration lives in the `langchain-xai` package:"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "652d6238-1f87-422a-b135-f5abbb8652fc",
"metadata": {},
"outputs": [],
"source": [
"%pip install -qU langchain-xai"
]
},
{
"cell_type": "markdown",
"id": "a38cde65-254d-4219-a441-068766c0d4b5",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"Now we can instantiate our model object and generate chat completions:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae",
"metadata": {},
"outputs": [],
"source": [
"from langchain_xai import ChatXAI\n",
"\n",
"llm = ChatXAI(\n",
"    model=\"grok-beta\",\n",
"    temperature=0,\n",
"    max_tokens=None,\n",
"    timeout=None,\n",
"    max_retries=2,\n",
"    # other params...\n",
")"
]
},
{
"cell_type": "markdown",
"id": "2b4f3e15",
"metadata": {},
"source": [
"## Invocation"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "62e0dbc3",
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=\"J'adore programmer.\", additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 6, 'prompt_tokens': 30, 'total_tokens': 36, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model_name': 'grok-beta', 'system_fingerprint': 'fp_14b89b2dfc', 'finish_reason': 'stop', 'logprobs': None}, id='run-adffb7a3-e48a-4f52-b694-340d85abe5c3-0', usage_metadata={'input_tokens': 30, 'output_tokens': 6, 'total_tokens': 36, 'input_token_details': {}, 'output_token_details': {}})"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"messages = [\n",
"    (\n",
"        \"system\",\n",
"        \"You are a helpful assistant that translates English to French. Translate the user sentence.\",\n",
"    ),\n",
"    (\"human\", \"I love programming.\"),\n",
"]\n",
"ai_msg = llm.invoke(messages)\n",
"ai_msg"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "d86145b3-bfef-46e8-b227-4dda5c9c2705",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"J'adore programmer.\n"
]
}
],
"source": [
"print(ai_msg.content)"
]
},
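{
"cell_type": "markdown",
"id": "streaming-sketch-md",
"metadata": {},
"source": [
"The features table above lists token-level streaming as supported. As a minimal sketch (not part of the original notebook), the standard `stream` method should yield message chunks; this reuses the `messages` list defined above:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "streaming-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sketch: stream token-level chunks via the standard `stream` method.\n",
"for chunk in llm.stream(messages):\n",
"    print(chunk.content, end=\"\", flush=True)"
]
},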
{
"cell_type": "markdown",
"id": "18e2bfc0-7e78-4528-a73f-499ac150dca8",
"metadata": {},
"source": [
"## Chaining\n",
"\n",
"We can [chain](../../how_to/sequence.ipynb) our model with a prompt template like so:"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "e197d1d7-a070-4c96-9f8a-a0e86d046e0b",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='Ich liebe das Programmieren.', additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 7, 'prompt_tokens': 25, 'total_tokens': 32, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model_name': 'grok-beta', 'system_fingerprint': 'fp_14b89b2dfc', 'finish_reason': 'stop', 'logprobs': None}, id='run-569fc8dc-101b-4e6d-864e-d4fa80df2b63-0', usage_metadata={'input_tokens': 25, 'output_tokens': 7, 'total_tokens': 32, 'input_token_details': {}, 'output_token_details': {}})"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain_core.prompts import ChatPromptTemplate\n",
"\n",
"prompt = ChatPromptTemplate.from_messages(\n",
"    [\n",
"        (\n",
"            \"system\",\n",
"            \"You are a helpful assistant that translates {input_language} to {output_language}.\",\n",
"        ),\n",
"        (\"human\", \"{input}\"),\n",
"    ]\n",
")\n",
"\n",
"chain = prompt | llm\n",
"chain.invoke(\n",
"    {\n",
"        \"input_language\": \"English\",\n",
"        \"output_language\": \"German\",\n",
"        \"input\": \"I love programming.\",\n",
"    }\n",
")"
]
},
{
"cell_type": "markdown",
"id": "e074bce1-0994-4b83-b393-ae7aa7e21750",
"metadata": {},
"source": [
"## Tool calling\n",
"\n",
"ChatXAI has a [tool calling](https://docs.x.ai/docs#capabilities) (we use \"tool calling\" and \"function calling\" interchangeably here) API that lets you describe tools and their arguments, and have the model return a JSON object with a tool to invoke and the inputs to that tool. Tool-calling is extremely useful for building tool-using chains and agents, and for getting structured outputs from models more generally.\n", | ||
"\n", | ||
"### ChatXAI.bind_tools()\n", | ||
"\n", | ||
"With `ChatXAI.bind_tools`, we can easily pass in Pydantic classes, dict schemas, LangChain tools, or even functions as tools to the model. Under the hood these are converted to an OpenAI tool schemas, which looks like:\n", | ||
"```\n", | ||
"{\n", | ||
" \"name\": \"...\",\n", | ||
" \"description\": \"...\",\n", | ||
" \"parameters\": {...} # JSONSchema\n", | ||
"}\n", | ||
"```\n", | ||
"and passed in every model invocation." | ||
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "c6bfe929-ec02-46bd-9d54-76350edddabc",
"metadata": {},
"outputs": [],
"source": [
"from pydantic import BaseModel, Field\n",
"\n",
"\n",
"class GetWeather(BaseModel):\n",
"    \"\"\"Get the current weather in a given location\"\"\"\n",
"\n",
"    location: str = Field(..., description=\"The city and state, e.g. San Francisco, CA\")\n",
"\n",
"\n",
"llm_with_tools = llm.bind_tools([GetWeather])"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "5265c892-d8c2-48af-aef5-adbee1647ba6",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='I am retrieving the current weather for San Francisco.', additional_kwargs={'tool_calls': [{'id': '0', 'function': {'arguments': '{\"location\":\"San Francisco, CA\"}', 'name': 'GetWeather'}, 'type': 'function'}], 'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 11, 'prompt_tokens': 151, 'total_tokens': 162, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model_name': 'grok-beta', 'system_fingerprint': 'fp_14b89b2dfc', 'finish_reason': 'tool_calls', 'logprobs': None}, id='run-73707da7-afec-4a52-bee1-a176b0ab8585-0', tool_calls=[{'name': 'GetWeather', 'args': {'location': 'San Francisco, CA'}, 'id': '0', 'type': 'tool_call'}], usage_metadata={'input_tokens': 151, 'output_tokens': 11, 'total_tokens': 162, 'input_token_details': {}, 'output_token_details': {}})"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"ai_msg = llm_with_tools.invoke(\n",
"    \"what is the weather like in San Francisco\",\n",
")\n",
"ai_msg"
]
},
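{
"cell_type": "markdown",
"id": "structured-output-sketch-md",
"metadata": {},
"source": [
"The tool calling introduction above also mentions structured outputs, and the features table marks structured output as supported. As a hedged sketch (not part of the original notebook), the standard `with_structured_output` method should accept a Pydantic schema and return parsed objects; `Translation` here is a hypothetical schema used only for illustration:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "structured-output-sketch-code",
"metadata": {},
"outputs": [],
"source": [
"# Hedged sketch: `with_structured_output` is the standard LangChain method for structured outputs.\n",
"# `Translation` is a hypothetical schema defined here just for illustration.\n",
"from pydantic import BaseModel, Field\n",
"\n",
"\n",
"class Translation(BaseModel):\n",
"    \"\"\"A translation of a sentence into a target language.\"\"\"\n",
"\n",
"    text: str = Field(description=\"The translated sentence\")\n",
"    language: str = Field(description=\"The target language\")\n",
"\n",
"\n",
"structured_llm = llm.with_structured_output(Translation)\n",
"structured_llm.invoke(\"Translate 'I love programming.' into German.\")"
]
},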
{
"cell_type": "markdown",
"id": "3a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"For detailed documentation of all `ChatXAI` features and configurations, head to the API reference: https://python.langchain.com/api_reference/xai/chat_models/langchain_xai.chat_models.ChatXAI.html"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.9"
}
},
"nbformat": 4,
"nbformat_minor": 5
}