Stream does not work when using ChatXAI with AgentExecutor #29827

Open
5 tasks done
kboniadi opened this issue Feb 15, 2025 · 2 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature investigate Flagged for investigation.

Comments

kboniadi commented Feb 15, 2025

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.chat_history import InMemoryChatMessageHistory
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_core.tools import tool
from langchain_xai import ChatXAI

prompt2 = """You are a very smart AI Bot"""

# create a client and stream out the results from a basic template
llm = ChatXAI(
    model="grok",
    api_key="...",
    max_tokens=4096,
    top_p=0.95,
    streaming=True,
    max_retries=3,
)

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", prompt2),
        # First put the history
        MessagesPlaceholder(variable_name="chat_history"),
        # Then the new input
        ("human", "{input}"),
        # Finally the scratchpad
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)
    
@tool
def magic_function(input: int) -> int:
    """Applies a magic function to an input."""
    return input + 2

tools = [magic_function]

agent = create_tool_calling_agent(llm, tools, prompt)
agent_executor = AgentExecutor(
    agent=agent,
    tools=tools,
)

memory = InMemoryChatMessageHistory(session_id="test-session")
agent_with_chat_history = RunnableWithMessageHistory(
    agent_executor,
    # This is needed because in most real world scenarios, a session id is needed
    # It isn't really used here because we are using a simple in memory ChatMessageHistory
    lambda session_id: memory,
    input_messages_key="input",
    history_messages_key="chat_history",
)


res = []
for response in agent_with_chat_history.stream({"input": "What is 7+11?<|eot_id|><|start_header_id|>assistant<|end_header_id|>"}, {'configurable': {'session_id': 'test-session'}}):
    print(response)

Error Message and Stack Trace (if applicable)

no errors

Description

I'm trying to stream a response back using ChatXAI with AgentExecutor.
I expect to see the data streamed back token by token.
Instead, I see the full response returned in a single chunk, just like with the invoke method.
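For context on the reported behavior: `AgentExecutor.stream` yields step-level chunks (actions, tool observations, and the final output), not individual LLM tokens, even when the underlying model streams. A dependency-free sketch of that distinction; all names here are hypothetical stand-ins, not LangChain APIs:

```python
# Sketch (hypothetical names) of why wrapping a token-level stream in a
# step-level executor hides the tokens from the caller.

def fake_llm_stream(prompt):
    """Yields one token at a time, like a chat model with streaming=True."""
    for token in ["7", " + ", "11", " = ", "18"]:
        yield token

def fake_agent_executor_stream(prompt):
    """Mimics a step-level executor: it consumes the model's token stream
    internally and only yields whole steps, so the caller sees the full
    answer in a single chunk."""
    answer = "".join(fake_llm_stream(prompt))  # tokens collapsed here
    yield {"output": answer}

tokens = list(fake_llm_stream("What is 7+11?"))
steps = list(fake_agent_executor_stream("What is 7+11?"))

print(tokens)  # ['7', ' + ', '11', ' = ', '18']  (five token chunks)
print(steps)   # [{'output': '7 + 11 = 18'}]      (one step chunk)
```

The observed behavior in this issue matches the second generator: one dict per step rather than one chunk per token.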

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 23.6.0: Fri Nov 15 15:13:15 PST 2024; root:xnu-10063.141.1.702.7~1/RELEASE_ARM64_T6000
Python Version: 3.10.16 (main, Dec 3 2024, 17:27:57) [Clang 16.0.0 (clang-1600.0.26.4)]

Package Information

langchain_core: 0.3.34
langchain: 0.3.18
langchain_community: 0.3.14
langsmith: 0.2.11
langchain_huggingface: 0.1.2
langchain_openai: 0.3.5
langchain_text_splitters: 0.3.6
langchain_xai: 0.2.0
langgraph_sdk: 0.1.51

Optional packages not installed

langserve

Other Dependencies

aiohttp: 3.11.12
aiohttp<4.0.0,>=3.8.3: Installed. No version info available.
async-timeout<5.0.0,>=4.0.0;: Installed. No version info available.
dataclasses-json: 0.6.7
httpx: 0.28.1
httpx-sse: 0.4.0
huggingface-hub: 0.28.1
jsonpatch<2.0,>=1.33: Installed. No version info available.
langchain-anthropic;: Installed. No version info available.
langchain-aws;: Installed. No version info available.
langchain-cohere;: Installed. No version info available.
langchain-community;: Installed. No version info available.
langchain-core<1.0.0,>=0.3.34: Installed. No version info available.
langchain-deepseek;: Installed. No version info available.
langchain-fireworks;: Installed. No version info available.
langchain-google-genai;: Installed. No version info available.
langchain-google-vertexai;: Installed. No version info available.
langchain-groq;: Installed. No version info available.
langchain-huggingface;: Installed. No version info available.
langchain-mistralai;: Installed. No version info available.
langchain-ollama;: Installed. No version info available.
langchain-openai;: Installed. No version info available.
langchain-text-splitters<1.0.0,>=0.3.6: Installed. No version info available.
langchain-together;: Installed. No version info available.
langsmith-pyo3: Installed. No version info available.
langsmith<0.4,>=0.1.125: Installed. No version info available.
langsmith<0.4,>=0.1.17: Installed. No version info available.
numpy: 1.26.4
numpy<2,>=1.26.4;: Installed. No version info available.
numpy<3,>=1.26.2;: Installed. No version info available.
openai<2.0.0,>=1.58.1: Installed. No version info available.
orjson: 3.10.15
packaging<25,>=23.2: Installed. No version info available.
pydantic: 2.10.6
pydantic-settings: 2.7.1
pydantic<3.0.0,>=2.5.2;: Installed. No version info available.
pydantic<3.0.0,>=2.7.4: Installed. No version info available.
pydantic<3.0.0,>=2.7.4;: Installed. No version info available.
PyYAML: 6.0.2
PyYAML>=5.3: Installed. No version info available.
requests: 2.32.3
requests-toolbelt: 1.0.0
requests<3,>=2: Installed. No version info available.
sentence-transformers: 3.1.1
SQLAlchemy: 2.0.38
SQLAlchemy<3,>=1.4: Installed. No version info available.
tenacity: 9.0.0
tenacity!=8.4.0,<10,>=8.1.0: Installed. No version info available.
tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
tiktoken<1,>=0.7: Installed. No version info available.
tokenizers: 0.19.1
transformers: 4.43.4
typing-extensions>=4.7: Installed. No version info available.
zstandard: Installed. No version info available.

@kboniadi kboniadi changed the title Steam does not work when using ChatXAI with AgentExecutor Stream does not work when using ChatXAI with AgentExecutor Feb 15, 2025
@langcarl langcarl bot added the investigate Flagged for investigation. label Feb 15, 2025
@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Feb 15, 2025
@AffanBinFaisal
Have you tried setting streaming=True instead of False in:

llm = ChatXAI(
    model="grok",
    api_key="...",
    max_tokens=4096,
    top_p=0.95,
    streaming=True,  # ✅ Ensure streaming is enabled
    max_retries=3,
)


kboniadi commented Feb 17, 2025

@AffanBinFaisal yeah, I tried that. I actually should have left it as 'True' in the code paste; I'll update it. Below is the result of running the code with it set to 'True':

for response in agent_with_chat_history.stream({"input": "What is 7+11?<|eot_id|><|start_header_id|>assistant<|end_header_id|>"}, {'configurable': {'session_id': 'test-session'}}):
    print(response)

{'output': '7 + 11 = 18', 'messages': [AIMessage(content='7 + 11 = 18', additional_kwargs={}, response_metadata={})]}
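A note on what token-level streaming from an agent usually looks like: it typically comes from an event stream that is filtered for chat-model chunk events, rather than from `.stream` itself. The sketch below is dependency-free and the event dicts are simulated, not produced by LangChain; the event name `on_chat_model_stream` follows LangChain's `astream_events` v2 naming convention, but the exact payload shapes here are assumptions:

```python
# Dependency-free sketch of filtering an event stream for token chunks.
# The events are simulated; with a real agent you would iterate the
# agent's event stream instead of this generator.

def simulated_events():
    # Shapes loosely modeled on an astream_events-style schema (assumption).
    yield {"event": "on_chain_start", "data": {}}
    for tok in ["7 + 11", " = ", "18"]:
        yield {"event": "on_chat_model_stream",
               "data": {"chunk": {"content": tok}}}
    yield {"event": "on_chain_end", "data": {"output": "7 + 11 = 18"}}

def iter_tokens(events):
    """Keep only the token chunks, dropping step-level events."""
    for ev in events:
        if ev["event"] == "on_chat_model_stream":
            yield ev["data"]["chunk"]["content"]

print("".join(iter_tokens(simulated_events())))  # 7 + 11 = 18
```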
