j-langchain

Introduction

j-langchain is a Java edition of the LangChain development framework, intended to simplify and accelerate building large-model applications on the Java platform. It provides a set of practical tools and classes that make it easier for developers to build LangChain-style Java applications.

Dependencies

Maven:

```xml
<dependency>
    <groupId>io.github.flower-trees</groupId>
    <artifactId>j-langchain</artifactId>
    <version>1.0.1-preview</version>
</dependency>
```

Gradle:

```groovy
implementation 'io.github.flower-trees:j-langchain:1.0.1-preview'
```

Notes: the framework is built on the salt-function-flow flow-orchestration framework; see that project's documentation for the orchestration syntax.

Chain Construction

Sequential Invocation

LangChain implementation:

```python
from langchain_ollama import OllamaLLM
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

model = OllamaLLM(model="qwen2.5:0.5b")
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")

chain = prompt | model | StrOutputParser()

result = chain.invoke({"topic": "bears"})
print(result)
```

J-LangChain implementation (the original snippet referenced an undefined variable `oll`; the built `chatOpenAI` model is used here instead):

```java
@Component
public class ChainBuildDemo {

    @Autowired
    ChainActor chainActor;

    public void SimpleDemo() {
        BaseRunnable<StringPromptValue, ?> prompt = PromptTemplate.fromTemplate("tell me a joke about ${topic}");
        ChatOpenAI chatOpenAI = ChatOpenAI.builder().model("gpt-4").build();

        FlowInstance chain = chainActor.builder()
                .next(prompt)
                .next(chatOpenAI)
                .next(new StrOutputParser())
                .build();

        ChatGeneration result = chainActor.invoke(chain, Map.of("topic", "bears"));
        System.out.println(result);
    }
}
```

Branch Routing

J-LangChain implementation:

```java
public void SwitchDemo() {
    BaseRunnable<StringPromptValue, ?> prompt = PromptTemplate.fromTemplate("tell me a joke about ${topic}");
    ChatOllama chatOllama = ChatOllama.builder().model("llama3:8b").build();
    ChatOpenAI chatOpenAI = ChatOpenAI.builder().model("gpt-4").build();

    FlowInstance chain = chainActor.builder()
            .next(prompt)
            .next(
                Info.c("vendor == 'ollama'", chatOllama),
                Info.c("vendor == 'chatgpt'", chatOpenAI),
                Info.c(input -> "sorry, I don't know how to do that"))
            .next(new StrOutputParser())
            .build();

    Generation result = chainActor.invoke(chain, Map.of("topic", "bears", "vendor", "ollama"));
    System.out.println(result);
}
```

Composition and Nesting

LangChain implementation:

```python
analysis_prompt = ChatPromptTemplate.from_template("is this a funny joke? {joke}")

composed_chain = {"joke": chain} | analysis_prompt | model | StrOutputParser()

result = composed_chain.invoke({"topic": "bears"})
print(result)
```

J-LangChain implementation:

```java
public void ComposeDemo() {
    ChatOllama llm = ChatOllama.builder().model("llama3:8b").build();
    StrOutputParser parser = new StrOutputParser();

    BaseRunnable<StringPromptValue, ?> prompt = PromptTemplate.fromTemplate("tell me a joke about ${topic}");
    FlowInstance chain = chainActor.builder().next(prompt).next(llm).next(parser).build();

    BaseRunnable<StringPromptValue, ?> analysisPrompt = PromptTemplate.fromTemplate("is this a funny joke? ${joke}");

    FlowInstance analysisChain = chainActor.builder()
            .next(chain)
            .next(input -> Map.of("joke", ((Generation) input).getText()))
            .next(analysisPrompt)
            .next(llm)
            .next(parser)
            .build();

    ChatGeneration result = chainActor.invoke(analysisChain, Map.of("topic", "bears"));
    System.out.println(result);
}
```

Parallel Execution

LangChain implementation:

```python
from langchain_core.runnables import RunnableParallel

joke_chain = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
poem_chain = ChatPromptTemplate.from_template("write a 2-line poem about {topic}") | model

parallel_chain = RunnableParallel(joke=joke_chain, poem=poem_chain)

result = parallel_chain.invoke({"topic": "bear"})
print(result)
```

J-LangChain implementation:

```java
public void ParallelDemo() {
    ChatOllama llm = ChatOllama.builder().model("llama3:8b").build();

    BaseRunnable<StringPromptValue, ?> joke = PromptTemplate.fromTemplate("tell me a joke about ${topic}");
    BaseRunnable<StringPromptValue, ?> poem = PromptTemplate.fromTemplate("write a 2-line poem about ${topic}");

    FlowInstance jokeChain = chainActor.builder().next(joke).next(llm).build();
    FlowInstance poemChain = chainActor.builder().next(poem).next(llm).build();

    FlowInstance chain = chainActor.builder()
            .concurrent(
                (IResult<Map<String, String>>) (iContextBus, isTimeout) -> {
                    AIMessage jokeResult = iContextBus.getResult(jokeChain.getFlowId());
                    AIMessage poemResult = iContextBus.getResult(poemChain.getFlowId());
                    return Map.of("joke", jokeResult.getContent(), "poem", poemResult.getContent());
                },
                jokeChain, poemChain)
            .build();

    Map<String, String> result = chainActor.invoke(chain, Map.of("topic", "bears"));
    System.out.println(JsonUtil.toJson(result));
}
```

Dynamic Routing

LangChain implementation, using RunnableLambda for the routing step:

```python
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

chain = (
    PromptTemplate.from_template(
        """Given the user question below, classify it as either being about `LangChain`, `Anthropic`, or `Other`.

Do not respond with more than one word.

<question>
{question}
</question>

Classification:"""
    )
    | OllamaLLM(model="qwen2.5:0.5b")
    | StrOutputParser()
)

langchain_chain = PromptTemplate.from_template(
    """You are an expert in langchain. \
Always answer questions starting with "As Harrison Chase told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

anthropic_chain = PromptTemplate.from_template(
    """You are an expert in anthropic. \
Always answer questions starting with "As Dario Amodei told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

general_chain = PromptTemplate.from_template(
    """Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

def route(info):
    if "anthropic" in info["topic"].lower():
        return anthropic_chain
    elif "langchain" in info["topic"].lower():
        return langchain_chain
    else:
        return general_chain

full_chain = {"topic": chain, "question": lambda x: x["question"]} | RunnableLambda(route)

result = full_chain.invoke({"question": "how do I use LangChain?"})
print(result)
```

J-LangChain implementation:

```java
public void RouteDemo() {
    ChatOllama llm = ChatOllama.builder().model("llama3:8b").build();

    BaseRunnable<StringPromptValue, Object> prompt = PromptTemplate.fromTemplate(
            """
            Given the user question below, classify it as either being about `LangChain`, `Anthropic`, or `Other`.

            Do not respond with more than one word.

            <question>
            ${question}
            </question>

            Classification:""");

    FlowInstance chain = chainActor.builder().next(prompt).next(llm).next(new StrOutputParser()).build();

    FlowInstance langchainChain = chainActor.builder().next(PromptTemplate.fromTemplate(
            """
            You are an expert in langchain. \
            Always answer questions starting with "As Harrison Chase told me". \
            Respond to the following question:

            Question: ${question}
            Answer:""")).next(ChatOllama.builder().model("llama3:8b").build()).build();

    FlowInstance anthropicChain = chainActor.builder().next(PromptTemplate.fromTemplate(
            """
            You are an expert in anthropic. \
            Always answer questions starting with "As Dario Amodei told me". \
            Respond to the following question:

            Question: ${question}
            Answer:""")).next(ChatOllama.builder().model("llama3:8b").build()).build();

    FlowInstance generalChain = chainActor.builder().next(PromptTemplate.fromTemplate(
            """
            Respond to the following question:

            Question: ${question}
            Answer:""")).next(ChatOllama.builder().model("llama3:8b").build()).build();

    FlowInstance fullChain = chainActor.builder()
            .next(chain)
            .next(input -> Map.of("topic", input, "question", ((Map<?, ?>) ContextBus.get().getFlowParam()).get("question")))
            .next(
                Info.c("topic == 'anthropic'", anthropicChain),
                Info.c("topic == 'langchain'", langchainChain),
                Info.c(generalChain))
            .build();

    AIMessage result = chainActor.invoke(fullChain, Map.of("question", "how do I use Anthropic?"));
    System.out.println(result.getContent());
}
```

Dynamic Construction

LangChain implementation (the original snippet omitted the `itemgetter` import, added here):

```python
from operator import itemgetter

from langchain_core.runnables import chain, RunnablePassthrough

llm = OllamaLLM(model="qwen2.5:0.5b")

contextualize_instructions = """Convert the latest user question into a standalone question given the chat history. Don't answer the question, return the question and nothing else (no descriptive text)."""
contextualize_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_instructions),
        ("placeholder", "{chat_history}"),
        ("human", "{question}"),
    ]
)
contextualize_question = contextualize_prompt | llm | StrOutputParser()

@chain
def contextualize_if_needed(input_: dict):
    if input_.get("chat_history"):
        return contextualize_question
    else:
        return RunnablePassthrough() | itemgetter("question")

@chain
def fake_retriever(input_: dict):
    return "egypt's population in 2024 is about 111 million"

qa_instructions = (
    """Answer the user question given the following context:\n\n{context}."""
)
qa_prompt = ChatPromptTemplate.from_messages(
    [("system", qa_instructions), ("human", "{question}")]
)

full_chain = (
    RunnablePassthrough.assign(question=contextualize_if_needed).assign(
        context=fake_retriever
    )
    | qa_prompt
    | llm
    | StrOutputParser()
)

result = full_chain.invoke(
    {
        "question": "what about egypt",
        "chat_history": [
            ("human", "what's the population of indonesia"),
            ("ai", "about 276 million"),
        ],
    }
)
print(result)
```

J-LangChain implementation:

```java
public void DynamicDemo() {
    ChatOllama llm = ChatOllama.builder().model("llama3:8b").build();

    String contextualizeInstructions = "Convert the latest user question into a standalone question given the chat history. Don't answer the question, return the question and nothing else (no descriptive text).";
    BaseRunnable<ChatPromptValue, Object> contextualizePrompt = ChatPromptTemplate.fromMessages(
            List.of(
                Pair.of("system", contextualizeInstructions),
                Pair.of("placeholder", "${chatHistory}"),
                Pair.of("human", "${question}")));

    FlowInstance contextualizeQuestion = chainActor.builder().next(contextualizePrompt).next(llm).next(new StrOutputParser()).build();

    FlowInstance contextualizeIfNeeded = chainActor.builder().next(
            Info.c("chatHistory != null", contextualizeQuestion),
            Info.c(input -> Map.of("question", ((Map<String, String>) input).get("question")))
    ).build();

    String qaInstructions = "Answer the user question given the following context:\n\n${context}.";
    BaseRunnable<ChatPromptValue, Object> qaPrompt = ChatPromptTemplate.fromMessages(
            List.of(
                Pair.of("system", qaInstructions),
                Pair.of("human", "${question}")));

    FlowInstance fullChain = chainActor.builder()
            .all(
                (iContextBus, isTimeout) -> Map.of(
                    "question", iContextBus.getResult(contextualizeIfNeeded.getFlowId()).toString(),
                    "context", iContextBus.getResult("fakeRetriever")),
                Info.c(contextualizeIfNeeded),
                Info.c(input -> "egypt's population in 2024 is about 111 million").cAlias("fakeRetriever"))
            .next(qaPrompt)
            .next(input -> { System.out.println(JsonUtil.toJson(input)); return input; })
            .next(llm)
            .next(new StrOutputParser())
            .build();

    ChatGeneration result = chainActor.invoke(fullChain,
            Map.of(
                "question", "what about egypt",
                "chatHistory", List.of(
                    Pair.of("human", "what's the population of indonesia"),
                    Pair.of("ai", "about 276 million"))));
    System.out.println(result);
}
```
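To make the `prompt | model | parser` style of composition used throughout these examples less magical, here is a minimal, self-contained sketch of the pipe pattern. The `Runnable` class below is a hypothetical stand-in, not the real LangChain base class, and `fake_model` simply echoes its input instead of calling an LLM:

```python
class Runnable:
    """Toy runnable: wraps a function and supports `|` composition."""

    def __init__(self, fn):
        self.fn = fn

    def invoke(self, value):
        return self.fn(value)

    def __or__(self, other):
        # `a | b` returns a new Runnable that feeds a's output into b.
        return Runnable(lambda value: other.invoke(self.invoke(value)))


# Placeholder steps standing in for PromptTemplate, the chat model, and StrOutputParser.
prompt = Runnable(lambda d: f"tell me a joke about {d['topic']}")
fake_model = Runnable(lambda text: f"[model output for: {text}]")
parser = Runnable(lambda msg: msg.strip())

chain = prompt | fake_model | parser
print(chain.invoke({"topic": "bears"}))
# → [model output for: tell me a joke about bears]
```

The real LangChain runnables layer batching, streaming, and async execution on top of essentially this composition idea; j-langchain's `chainActor.builder().next(...)` chain expresses the same pipeline imperatively.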
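The parallel-execution pattern can likewise be sketched without any framework. The sketch below uses plain Python threads in place of `RunnableParallel` / `chainActor.concurrent`; `joke_chain` and `poem_chain` are placeholder functions, not real model calls:

```python
from concurrent.futures import ThreadPoolExecutor


def joke_chain(topic: str) -> str:
    # Placeholder for "prompt | model" producing a joke.
    return f"joke about {topic}"


def poem_chain(topic: str) -> str:
    # Placeholder for "prompt | model" producing a poem.
    return f"2-line poem about {topic}"


# Run both branches concurrently and merge their outputs into one dict,
# mirroring how the parallel chain collects results keyed by branch name.
with ThreadPoolExecutor() as pool:
    futures = {
        "joke": pool.submit(joke_chain, "bears"),
        "poem": pool.submit(poem_chain, "bears"),
    }
    result = {name: fut.result() for name, fut in futures.items()}

print(result)
# → {'joke': 'joke about bears', 'poem': '2-line poem about bears'}
```

In j-langchain the merge step is the `IResult` callback passed to `concurrent(...)`, which reads each branch's output from the context bus by flow id.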
