
Stream does not work with ChatMistralAI and with_structured_output #29860

Open
5 tasks done
brochier opened this issue Feb 18, 2025 · 2 comments
Labels
🤖:bug — Related to a bug, vulnerability, unexpected error with an existing feature
investigate — Flagged for investigation.

Comments

brochier commented Feb 18, 2025

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import os
import getpass
from langchain_mistralai.chat_models import ChatMistralAI


if "MISTRAL_API_KEY" not in os.environ:
    os.environ["MISTRAL_API_KEY"] = getpass.getpass("Enter your Mistral API key: ")

model = ChatMistralAI(
    temperature=0, max_retries=2, model="mistral-small-latest", streaming=True
)

output_schema = {
    "title": "biggest cities",
    "description": "",
    "type": "object",
    "properties": {
        "cities": {"type": "array", "items": {"type": "string"}, "description": ""}
    },
    "required": ["cities"],
}

model_with_structure = model.with_structured_output(
    output_schema, method="function_calling"
)

for chunk in model_with_structure.stream("What are the 10 biggest cities in France?"):
    print(chunk)

Error Message and Stack Trace (if applicable)

No response

Description

When using the ChatMistralAI model with the with_structured_output method in function_calling mode, the stream method no longer streams: the entire response is received at once. Any idea why this happens? Is it MistralAI that does not stream the answer in this case?
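For context on why streaming can appear to stop: in function-calling mode the structured payload arrives as the tool call's arguments string, so whether .stream() yields many chunks depends on whether the provider streams those argument fragments incrementally. A network-free simulation of the idea (the fragment contents below are invented for illustration, not actual Mistral output):

```python
import json

# Hypothetical argument fragments, as a provider might stream them.
fragments = ['{"cities": ["Paris", ', '"Marseille", "Lyon"', ']}']

buffer = ""
for frag in fragments:
    buffer += frag  # each streamed chunk appends to the arguments string

# The complete payload only parses as JSON once all fragments have arrived.
print(json.loads(buffer))  # -> {'cities': ['Paris', 'Marseille', 'Lyon']}
```

If the provider sends the whole arguments string in a single chunk, the consumer sees exactly one (complete) structured result, which matches the behavior described above.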

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 24.2.0: Fri Dec 6 18:51:28 PST 2024; root:xnu-11215.61.5~2/RELEASE_ARM64_T8112
Python Version: 3.11.6 (main, Apr 8 2024, 17:17:59) [Clang 15.0.0 (clang-1500.3.9.4)]

Package Information

langchain_core: 0.3.35
langchain: 0.3.19
langsmith: 0.2.11
langchain_google_genai: 2.0.9
langchain_mistralai: 0.2.4
langchain_ollama: 0.2.2
langchain_openai: 0.3.0
langchain_text_splitters: 0.3.6

Optional packages not installed

langserve

Other Dependencies

aiohttp<4.0.0,>=3.8.3: Installed. No version info available.
async-timeout<5.0.0,>=4.0.0;: Installed. No version info available.
filetype: 1.2.0
google-generativeai: 0.8.4
httpx: 0.27.2
httpx-sse: 0.4.0
jsonpatch<2.0,>=1.33: Installed. No version info available.
langchain-anthropic;: Installed. No version info available.
langchain-aws;: Installed. No version info available.
langchain-cohere;: Installed. No version info available.
langchain-community;: Installed. No version info available.
langchain-core<1.0.0,>=0.3.34: Installed. No version info available.
langchain-core<1.0.0,>=0.3.35: Installed. No version info available.
langchain-deepseek;: Installed. No version info available.
langchain-fireworks;: Installed. No version info available.
langchain-google-genai;: Installed. No version info available.
langchain-google-vertexai;: Installed. No version info available.
langchain-groq;: Installed. No version info available.
langchain-huggingface;: Installed. No version info available.
langchain-mistralai;: Installed. No version info available.
langchain-ollama;: Installed. No version info available.
langchain-openai;: Installed. No version info available.
langchain-text-splitters<1.0.0,>=0.3.6: Installed. No version info available.
langchain-together;: Installed. No version info available.
langchain-xai;: Installed. No version info available.
langsmith-pyo3: Installed. No version info available.
langsmith<0.4,>=0.1.125: Installed. No version info available.
langsmith<0.4,>=0.1.17: Installed. No version info available.
numpy<2,>=1.26.4;: Installed. No version info available.
numpy<3,>=1.26.2;: Installed. No version info available.
ollama: 0.4.6
openai: 1.59.7
orjson: 3.10.14
packaging<25,>=23.2: Installed. No version info available.
pydantic: 2.10.5
pydantic<3.0.0,>=2.5.2;: Installed. No version info available.
pydantic<3.0.0,>=2.7.4: Installed. No version info available.
pydantic<3.0.0,>=2.7.4;: Installed. No version info available.
PyYAML>=5.3: Installed. No version info available.
requests: 2.32.3
requests-toolbelt: 1.0.0
requests<3,>=2: Installed. No version info available.
SQLAlchemy<3,>=1.4: Installed. No version info available.
tenacity!=8.4.0,<10,>=8.1.0: Installed. No version info available.
tenacity!=8.4.0,<10.0.0,>=8.1.0: Installed. No version info available.
tiktoken: 0.8.0
tokenizers: 0.21.0
typing-extensions>=4.7: Installed. No version info available.
zstandard: Installed. No version info available.

@langcarl langcarl bot added the investigate Flagged for investigation. label Feb 18, 2025
@dosubot dosubot bot added the 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature label Feb 18, 2025
brochier (Author) commented:

If anyone could propose a quick fix that keeps the custom JSON mode while preserving the streamed response, it would be amazing! I tried to dive into the code, but I can't work out where exactly the JSON parser is defined when using the with_structured_output method, so I can't verify whether partial=True is passed.
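For what a partial=True-style parse does conceptually: after each streamed chunk, the parser tries to interpret the accumulated prefix as JSON, tolerating truncation. langchain_core ships a parse_partial_json helper for this (I believe under langchain_core.utils.json); the following is a simplified, stand-alone sketch of the same idea, not the library's actual implementation:

```python
import json

def try_partial_parse(buffer: str):
    """Best-effort parse of a possibly incomplete JSON object."""
    # First, try the buffer exactly as received.
    try:
        return json.loads(buffer)
    except json.JSONDecodeError:
        pass
    # Otherwise, naively append closers for any unclosed strings/brackets.
    stack, in_string, escape = [], False, False
    for ch in buffer:
        if in_string:
            if escape:
                escape = False
            elif ch == "\\":
                escape = True
            elif ch == '"':
                in_string = False
        elif ch == '"':
            in_string = True
        elif ch in "{[":
            stack.append("}" if ch == "{" else "]")
        elif ch in "}]" and stack:
            stack.pop()
    candidate = buffer + ('"' if in_string else "") + "".join(reversed(stack))
    try:
        return json.loads(candidate)
    except json.JSONDecodeError:
        return None  # prefix not repairable yet; wait for more chunks

chunks = ['{"cities": ["Paris"', ', "Marseille"', ', "Lyon"]}']
buffer = ""
for chunk in chunks:
    buffer += chunk
    print(try_partial_parse(buffer))  # grows from 1 to 3 cities
```

With a parser like this wired in, each streamed chunk can yield a progressively more complete dict rather than waiting for the full response.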

andrasfe commented:

I don't know if there is more to it, but adding an underscore to "biggest cities" fixed it for me.

The error comes from the output_schema definition, specifically the title "biggest cities". According to the error message, function names may only contain alphanumeric characters, underscores, and dashes (no spaces).
Here's how to fix it:

# ... existing code ...

output_schema = {
    "title": "biggest_cities",  # changed from "biggest cities"
    "description": "",
    "type": "object",
    "properties": {
        "cities": {"type": "array", "items": {"type": "string"}, "description": ""}
    },
    "required": ["cities"],
}

# ... existing code ...
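That constraint can also be checked or enforced up front before calling with_structured_output. A small sketch; the character class below is my reading of the error message quoted above, not Mistral's published grammar:

```python
import re

# Allowed per the error message: alphanumerics, underscores, and dashes.
VALID_FUNCTION_NAME = re.compile(r"^[a-zA-Z0-9_-]+$")

def sanitize_title(title: str) -> str:
    """Replace every disallowed character (e.g. spaces) with an underscore."""
    return re.sub(r"[^a-zA-Z0-9_-]", "_", title)

print(sanitize_title("biggest cities"))  # -> biggest_cities
print(bool(VALID_FUNCTION_NAME.match("biggest cities")))  # -> False
```

Running the schema title through such a check before building the tool definition would surface the problem as an explicit error instead of a silently degraded stream.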
