create_qa_with_structure_chain
langchain.chains.openai_functions.qa_with_structure.create_qa_with_structure_chain(
    llm: BaseLanguageModel,
    schema: dict | type[BaseModel],
    output_parser: str = 'base',
    prompt: PromptTemplate | ChatPromptTemplate | None = None,
    verbose: bool = False,
)
Deprecated since version 0.2.13: Refer instead to this guide on retrieval and question answering with structured responses: https://python.langchain.com/docs/how_to/qa_sources/#structure-sources-in-model-response. This function will not be removed until langchain==1.0.
Create a question answering chain with structure.
Create a question answering chain that returns an answer with sources, structured according to the provided schema.
- Parameters:
llm (BaseLanguageModel) – Language model to use for the chain.
schema (dict | type[BaseModel]) – Pydantic model (or equivalent dict schema) that defines the structure of the output.
output_parser (str) – Output parser to use. Should be one of 'pydantic' or 'base'. Defaults to 'base'.
prompt (PromptTemplate | ChatPromptTemplate | None) – Optional prompt to use for the chain.
verbose (bool) – Whether to run the chain in verbose mode.
- Returns:
The question answering chain.
- Return type:
LLMChain
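Example: a minimal usage sketch, not taken from this reference itself. The AnswerWithSources schema, the ChatOpenAI model name, and the question/context inputs are illustrative assumptions; the default prompt expects context and question inputs.

```python
# Minimal usage sketch (hedged): the schema, model choice, and inputs below
# are illustrative assumptions, not part of this reference.
from langchain.chains.openai_functions.qa_with_structure import (
    create_qa_with_structure_chain,
)
from langchain_openai import ChatOpenAI  # assumes the langchain-openai package is installed
from pydantic import BaseModel, Field


class AnswerWithSources(BaseModel):
    """An answer to the question, together with the sources it is based on."""

    answer: str = Field(..., description="The answer to the question.")
    sources: list[str] = Field(
        ..., description="Sources (e.g. document snippets) supporting the answer."
    )


llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)  # model name is an assumption
chain = create_qa_with_structure_chain(llm, AnswerWithSources, output_parser="pydantic")

# The default prompt takes `context` and `question` inputs; with
# output_parser="pydantic" the parsed result is an AnswerWithSources instance
# stored under the chain's output key (commonly "text").
result = chain.invoke(
    {
        "question": "Who wrote 'The Hobbit'?",
        "context": "The Hobbit is a fantasy novel by J. R. R. Tolkien, published in 1937.",
    }
)
print(result["text"])
```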