Series index: LangChain Tutorial - Series Articles

LangChain provides a flexible and powerful expression language, the LangChain Expression Language (LCEL), for creating complex logic chains. By composing different runnable objects, LCEL supports advanced patterns such as sequential chains, nested chains, parallel chains, routing, and dynamic construction, covering the needs of a wide range of scenarios. This article walks through each of these features and how to implement them.

## Sequential Chains

The core capability of LCEL is composing runnables in sequence, where the output of each runnable is automatically passed as input to the next. A sequential chain can be built with the pipe operator (`|`) or with the explicit `.pipe()` method (a sketch of the latter follows the example below).

Here is a simple example:

```python
from langchain_ollama import OllamaLLM
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

model = OllamaLLM(model="qwen2.5:0.5b")
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")

chain = prompt | model | StrOutputParser()

result = chain.invoke({"topic": "bears"})
print(result)
```

Output:

```
Here's a bear joke for you:

Why did the bear dissolve in water?
Because it was a polar bear!
```

In this example, the prompt template formats the input for the model, the model generates the joke, and the output parser converts the result into a string.
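As mentioned above, the `|` operator is not the only way to compose runnables. A minimal sketch of the equivalent `.pipe()` form, reusing the `prompt` and `model` objects defined above:

```python
# The same chain, built with the explicit .pipe() method instead of `|`.
pipe_chain = prompt.pipe(model).pipe(StrOutputParser())

result = pipe_chain.invoke({"topic": "bears"})
print(result)
```

Both forms produce the same kind of composed sequence; the `|` operator is essentially syntactic sugar for `.pipe()`.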
## Nested Chains

Nested chains let us combine several chains into more complex logic. For example, the joke-generating chain can be composed with a second chain that analyzes how funny the joke is. The dict `{"joke": chain}` is automatically coerced into a runnable whose output dictionary feeds the analysis prompt.

```python
analysis_prompt = ChatPromptTemplate.from_template("is this a funny joke? {joke}")
composed_chain = {"joke": chain} | analysis_prompt | model | StrOutputParser()

result = composed_chain.invoke({"topic": "bears"})
print(result)
```

Output:

```
Haha, that's a clever play on words! Using "polar" to imply the bear dissolved or became polar/polarized when put in water. Not the most hilarious joke ever, but it has a cute, groan-worthy pun that makes it mildly amusing.
```

## Parallel Chains

`RunnableParallel` runs multiple chains in parallel and combines the result of each into a single dictionary. This is useful when several tasks need to be handled at the same time.

```python
from langchain_core.runnables import RunnableParallel

joke_chain = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
poem_chain = ChatPromptTemplate.from_template("write a 2-line poem about {topic}") | model

parallel_chain = RunnableParallel(joke=joke_chain, poem=poem_chain)

result = parallel_chain.invoke({"topic": "bear"})
print(result)
```

Output:

```
{'joke': "Why don't bears like fast food? Because they can't catch it!",
 'poem': 'In the quiet of the forest, the bear roams free\nMajestic and wild, a sight to see.'}
```
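Because every composed chain is itself a `Runnable`, the chains above also inherit the standard runnable interface: `invoke`, `batch`, `stream`, and their async counterparts. A minimal sketch reusing `parallel_chain` and `joke_chain` from above (the model output itself will vary):

```python
# Run the parallel chain over several inputs at once; the result is a list of
# {"joke": ..., "poem": ...} dictionaries, one per input.
results = parallel_chain.batch([{"topic": "bear"}, {"topic": "cat"}])
for r in results:
    print(r["joke"])

# Stream chunks from a chain as the model produces them.
for chunk in joke_chain.stream({"topic": "bear"}):
    print(chunk, end="", flush=True)
```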
## Routing

Routing selects which sub-chain to execute dynamically, based on the input. LCEL provides two ways to implement routing.

### Using a custom function

Dynamic routing via `RunnableLambda`: a classifier chain first labels the question as LangChain, Anthropic, or Other, and a routing function then returns the matching expert chain.

```python
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

chain = (
    PromptTemplate.from_template(
        """Given the user question below, classify it as either being about `LangChain`, `Anthropic`, or `Other`.

Do not respond with more than one word.

<question>
{question}
</question>

Classification:"""
    )
    | OllamaLLM(model="qwen2.5:0.5b")
    | StrOutputParser()
)

langchain_chain = PromptTemplate.from_template(
    """You are an expert in langchain. \
Always answer questions starting with "As Harrison Chase told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")
anthropic_chain = PromptTemplate.from_template(
    """You are an expert in anthropic. \
Always answer questions starting with "As Dario Amodei told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")
general_chain = PromptTemplate.from_template(
    """Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")


def route(info):
    if "anthropic" in info["topic"].lower():
        return anthropic_chain
    elif "langchain" in info["topic"].lower():
        return langchain_chain
    else:
        return general_chain


full_chain = {"topic": chain, "question": lambda x: x["question"]} | RunnableLambda(route)

result = full_chain.invoke({"question": "how do I use LangChain?"})
print(result)
```

### Using RunnableBranch

`RunnableBranch` selects a branch by condition matching: the first predicate that evaluates to true wins, and the final element is the default branch.

```python
from langchain_core.runnables import RunnableBranch

branch = RunnableBranch(
    (lambda x: "anthropic" in x["topic"].lower(), anthropic_chain),
    (lambda x: "langchain" in x["topic"].lower(), langchain_chain),
    general_chain,
)

full_chain = {"topic": chain, "question": lambda x: x["question"]} | branch
result = full_chain.invoke({"question": "how do I use Anthropic?"})
print(result)
```

## Dynamic Construction

Dynamic construction generates parts of a chain at runtime, based on the input. Through the return-value mechanism of `RunnableLambda` (applied here via the `@chain` decorator), a function can return a new `Runnable`, which is then executed in its place.

```python
from operator import itemgetter

from langchain_core.runnables import chain, RunnablePassthrough

llm = OllamaLLM(model="qwen2.5:0.5b")

contextualize_instructions = """Convert the latest user question into a standalone question given the chat history. Don't answer the question, return the question and nothing else (no descriptive text)."""
contextualize_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_instructions),
        ("placeholder", "{chat_history}"),
        ("human", "{question}"),
    ]
)
contextualize_question = contextualize_prompt | llm | StrOutputParser()


@chain
def contextualize_if_needed(input_: dict):
    # If there is chat history, rewrite the question; otherwise pass it through.
    if input_.get("chat_history"):
        return contextualize_question
    else:
        return RunnablePassthrough() | itemgetter("question")


@chain
def fake_retriever(input_: dict):
    return "egypt's population in 2024 is about 111 million"


qa_instructions = (
    "Answer the user question given the following context:\n\n{context}."
)
qa_prompt = ChatPromptTemplate.from_messages(
    [("system", qa_instructions), ("human", "{question}")]
)

full_chain = (
    RunnablePassthrough.assign(question=contextualize_if_needed).assign(
        context=fake_retriever
    )
    | qa_prompt
    | llm
    | StrOutputParser()
)

result = full_chain.invoke(
    {
        "question": "what about egypt",
        "chat_history": [
            ("human", "what's the population of indonesia"),
            ("ai", "about 276 million"),
        ],
    }
)
print(result)
```

Output:

```
According to the context provided, Egypt's population in 2024 is estimated to be about 111 million.
```
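A point worth noting about this pattern: because `contextualize_if_needed` returns a `Runnable` rather than a computed string, streaming is preserved through the dynamically constructed step. A minimal sketch, assuming the objects defined above are in scope:

```python
# Chunks arrive incrementally as the standalone question is generated,
# because the @chain-decorated function defers to another Runnable.
for chunk in contextualize_if_needed.stream(
    {
        "question": "what about egypt",
        "chat_history": [
            ("human", "what's the population of indonesia"),
            ("ai", "about 276 million"),
        ],
    }
):
    print(chunk, end="", flush=True)
```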
## Complete Code Example

```python
from operator import itemgetter

from langchain_ollama import OllamaLLM
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

print("\n-----------------------------------\n")

# Simple demo
model = OllamaLLM(model="qwen2.5:0.5b")
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")

chain = prompt | model | StrOutputParser()

result = chain.invoke({"topic": "bears"})
print(result)

print("\n-----------------------------------\n")

# Compose demo
analysis_prompt = ChatPromptTemplate.from_template("is this a funny joke? {joke}")
composed_chain = {"joke": chain} | analysis_prompt | model | StrOutputParser()

result = composed_chain.invoke({"topic": "bears"})
print(result)

print("\n-----------------------------------\n")

# Parallel demo
from langchain_core.runnables import RunnableParallel

joke_chain = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
poem_chain = ChatPromptTemplate.from_template("write a 2-line poem about {topic}") | model

parallel_chain = RunnableParallel(joke=joke_chain, poem=poem_chain)

result = parallel_chain.invoke({"topic": "bear"})
print(result)

print("\n-----------------------------------\n")

# Route demo
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

chain = (
    PromptTemplate.from_template(
        """Given the user question below, classify it as either being about `LangChain`, `Anthropic`, or `Other`.

Do not respond with more than one word.

<question>
{question}
</question>

Classification:"""
    )
    | OllamaLLM(model="qwen2.5:0.5b")
    | StrOutputParser()
)

langchain_chain = PromptTemplate.from_template(
    """You are an expert in langchain. \
Always answer questions starting with "As Harrison Chase told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")
anthropic_chain = PromptTemplate.from_template(
    """You are an expert in anthropic. \
Always answer questions starting with "As Dario Amodei told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")
general_chain = PromptTemplate.from_template(
    """Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")


def route(info):
    if "anthropic" in info["topic"].lower():
        return anthropic_chain
    elif "langchain" in info["topic"].lower():
        return langchain_chain
    else:
        return general_chain


full_chain = {"topic": chain, "question": lambda x: x["question"]} | RunnableLambda(route)

result = full_chain.invoke({"question": "how do I use LangChain?"})
print(result)

print("\n-----------------------------------\n")

# Branch demo
from langchain_core.runnables import RunnableBranch

branch = RunnableBranch(
    (lambda x: "anthropic" in x["topic"].lower(), anthropic_chain),
    (lambda x: "langchain" in x["topic"].lower(), langchain_chain),
    general_chain,
)

full_chain = {"topic": chain, "question": lambda x: x["question"]} | branch
result = full_chain.invoke({"question": "how do I use Anthropic?"})
print(result)

print("\n-----------------------------------\n")

# Dynamic demo
# Note: this import shadows the classifier `chain` defined above; the routing
# chains built earlier keep their own references, so they are unaffected.
from langchain_core.runnables import chain, RunnablePassthrough

llm = OllamaLLM(model="qwen2.5:0.5b")

contextualize_instructions = """Convert the latest user question into a standalone question given the chat history. Don't answer the question, return the question and nothing else (no descriptive text)."""
contextualize_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_instructions),
        ("placeholder", "{chat_history}"),
        ("human", "{question}"),
    ]
)
contextualize_question = contextualize_prompt | llm | StrOutputParser()


@chain
def contextualize_if_needed(input_: dict):
    if input_.get("chat_history"):
        return contextualize_question
    else:
        return RunnablePassthrough() | itemgetter("question")


@chain
def fake_retriever(input_: dict):
    return "egypt's population in 2024 is about 111 million"


qa_instructions = (
    "Answer the user question given the following context:\n\n{context}."
)
qa_prompt = ChatPromptTemplate.from_messages(
    [("system", qa_instructions), ("human", "{question}")]
)

full_chain = (
    RunnablePassthrough.assign(question=contextualize_if_needed).assign(
        context=fake_retriever
    )
    | qa_prompt
    | llm
    | StrOutputParser()
)

result = full_chain.invoke(
    {
        "question": "what about egypt",
        "chat_history": [
            ("human", "what's the population of indonesia"),
            ("ai", "about 276 million"),
        ],
    }
)
print(result)

print("\n-----------------------------------\n")
```

For a J-LangChain implementation of the examples above, see: J-LangChain - Building Intelligent Chains.

## Summary

Through sequential chains, nested chains, parallel chains, routing, and dynamic construction, LangChain's LCEL gives developers a powerful toolkit for building complex language tasks. Whether the job is a simple logic flow or complex dynamic decision-making, LCEL handles it efficiently. Used well, these features let developers quickly assemble efficient, flexible chains that support applications across a wide range of scenarios.