

## Preface

The code for this tutorial series: https://github.com/shar-pen/Langchain-MiniTutorial

I mainly followed the official LangChain tutorials and selectively recorded what I learned. The tutorial list:

1. First steps with LangChain
2. Prompt
3. OutputParser / output parsing
4. Model / vLLM model deployment and LangChain invocation
5. DocumentLoader / various document loaders
6. TextSplitter / document splitting
7. Embedding / text vectorization
8. VectorStore / vector database storage and retrieval
9. Retriever / retrievers
10. Reranker / document re-ranking
11. RAG pipeline / multi-turn conversational RAG
12. Agent / tool definition / agents calling tools / agentic RAG

## Prompt Template

Prompt templates are essential for generating dynamic, flexible prompts, and they apply in many scenarios such as conversation history, structured output, and query-specific instructions.

In this tutorial we explore how to create `PromptTemplate` objects, apply partial variables, manage templates via YAML files, and use advanced tools such as `ChatPromptTemplate` and `MessagesPlaceholder`.

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:5551/v1",
    api_key="EMPTY",
    model_name="Qwen2.5-7B-Instruct",
    temperature=0.2,
)
```

### Creating a PromptTemplate object

There are two ways to create a `PromptTemplate` object:

1. Use the `from_template()` method.
2. Create the `PromptTemplate` object directly, supplying the template at construction time.

### Method 1: `from_template()`

Define the template with `{variable}` syntax, where `variable` marks a replaceable slot.

```python
from langchain_core.prompts import PromptTemplate

# {} marks a variable
template = "What is the capital of {country}?"

# Create the template with from_template
prompt = PromptTemplate.from_template(template)
prompt
```

```
PromptTemplate(input_variables=['country'], input_types={}, partial_variables={}, template='What is the capital of {country}?')
```

The class has already parsed out the `country` variable; the prompt is completed by assigning it a value.

```python
# Similar to str.format
prompt.format(country="United States of America")
```

```
'What is the capital of United States of America?'
```

A chain simplifies the flow further:

```python
template = "What is the capital of {country}?"
prompt = PromptTemplate.from_template(template)
chain = prompt | llm
chain.invoke("United States of America").content
```

```
'The capital of the United States of America is Washington, D.C.'
```

### Method 2: create the PromptTemplate object directly

Specify `input_variables` explicitly for extra validation; if `input_variables` does not match the variables in the template string, instantiation may raise an exception.

```python
from langchain_core.prompts import PromptTemplate

# Define template
template = "What is the capital of {country}?"

# Create a prompt template with PromptTemplate object
prompt = PromptTemplate(
    template=template,
    input_variables=["country"],
)
prompt
```

```
PromptTemplate(input_variables=['country'], input_types={}, partial_variables={}, template='What is the capital of {country}?')
```

### Partial variables

`partial_variables` are temporarily fixable parameters: special `input_variables` that act as default values when the corresponding input is missing.

With `partial_variables` you can partially apply a function, which is especially useful when a common variable needs to be shared. A typical case is a date or time. Suppose you want the current date in the prompt: hardcoding it, or passing it manually on every call, is inflexible. A better approach is to partially apply a function that returns the current date, so the date variable is filled dynamically and the prompt stays adaptable.

```python
from langchain_core.prompts import PromptTemplate

# Define template
template = "What are the capitals of {country1} and {country2}, respectively?"

# Create a prompt template with PromptTemplate object
prompt = PromptTemplate(
    template=template,
    input_variables=["country1"],
    partial_variables={
        "country2": "United States of America"  # Pass partial_variables in dictionary form
    },
)
prompt
```

```
PromptTemplate(input_variables=['country1'], input_types={}, partial_variables={'country2': 'United States of America'}, template='What are the capitals of {country1} and {country2}, respectively?')
```

```python
prompt.format(country1="South Korea")
```

```
'What are the capitals of South Korea and United States of America, respectively?'
```

Partial variables can be modified or added via `partial()`, or by assigning `PromptTemplate.partial_variables` directly:

- `prompt_partial = prompt.partial(country2="India")` creates a new instance while keeping the original intact.
- `prompt.partial_variables = {'country2': 'china'}` modifies the original instance in place.

```python
prompt_partial = prompt.partial(country2="India")
prompt_partial.format(country1="South Korea")
```

```
'What are the capitals of South Korea and India, respectively?'
```

```python
prompt.partial_variables = {"country2": "china"}
prompt.format(country1="South Korea")
```

```
'What are the capitals of South Korea and china, respectively?'
```

A partial variable can be overridden with a new value at format time; this does not affect the stored default.

```python
print(prompt_partial.format(country1="South Korea", country2="Canada"))
print(prompt_partial.format(country1="South Korea"))
```

```
What are the capitals of South Korea and Canada, respectively?
What are the capitals of South Korea and India, respectively?
```
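The default-and-override behavior above can be sketched in plain Python. This is a hypothetical stand-in, not LangChain's implementation; `MiniPromptTemplate` is invented here purely for illustration:

```python
class MiniPromptTemplate:
    """Hypothetical stand-in for PromptTemplate, showing how
    partial_variables act as defaults for missing inputs."""

    def __init__(self, template, partial_variables=None):
        self.template = template
        self.partial_variables = dict(partial_variables or {})

    def partial(self, **kwargs):
        # Returns a NEW template with extra defaults; the original is untouched
        return MiniPromptTemplate(self.template, {**self.partial_variables, **kwargs})

    def format(self, **kwargs):
        # Explicit kwargs override stored defaults; callables are invoked lazily
        merged = {**self.partial_variables, **kwargs}
        values = {k: (v() if callable(v) else v) for k, v in merged.items()}
        return self.template.format(**values)


prompt = MiniPromptTemplate(
    "What are the capitals of {country1} and {country2}, respectively?",
    partial_variables={"country2": "United States of America"},
)
prompt_partial = prompt.partial(country2="India")
print(prompt_partial.format(country1="South Korea"))                     # default kicks in
print(prompt_partial.format(country1="South Korea", country2="Canada"))  # override wins
```

Note that `partial()` merges new defaults into a copy, which is why the original `prompt` keeps its own defaults.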
A partial variable can also be supplied as a function, so no value needs to be set manually:

```python
from datetime import datetime

def get_today():
    return datetime.now().strftime("%B %d")

prompt = PromptTemplate(
    template="Today's date is {today}. Please list {n} celebrities whose birthday is today. Please specify their date of birth.",
    input_variables=["n"],
    partial_variables={
        "today": get_today  # Pass partial_variables in dictionary form
    },
)

prompt.format(n=3)
```

```
"Today's date is January 30. Please list 3 celebrities whose birthday is today. Please specify their date of birth."
```

### Loading a prompt template from a YAML file

You can store prompt templates in separate YAML files and load and manage them with `load_prompt`. An example YAML file:

```yaml
_type: prompt
template: What is the color of {fruit}?
input_variables: [fruit]
```

```python
from langchain_core.prompts import load_prompt

prompt = load_prompt("prompts/fruit_color.yaml", encoding="utf-8")
prompt
```

### ChatPromptTemplate

`ChatPromptTemplate` can include conversation history in the prompt to provide context. Messages are organized as `(role, message)` tuples and stored in a list.

Roles:

- `system`: system message, typically used for global instructions or to set the AI's behavior.
- `human`: user input.
- `ai`: the AI's response.

```python
from langchain_core.prompts import ChatPromptTemplate

chat_prompt = ChatPromptTemplate.from_template("What is the capital of {country}?")
chat_prompt
```
```
ChatPromptTemplate(input_variables=['country'], input_types={}, partial_variables={}, messages=[HumanMessagePromptTemplate(prompt=PromptTemplate(input_variables=['country'], input_types={}, partial_variables={}, template='What is the capital of {country}?'), additional_kwargs={})])
```

Note that the prompt has been wrapped in a `HumanMessagePromptTemplate` and placed inside a list.

```python
chat_prompt.format(country="United States of America")
```

```
'Human: What is the capital of United States of America?'
```

### Multiple roles

Use `ChatPromptTemplate.from_messages` to define the template as a chat list, where each entry is a `(role, message)` tuple:

```python
from langchain_core.prompts import ChatPromptTemplate

chat_template = ChatPromptTemplate.from_messages(
    [
        # role, message
        ("system", "You are a friendly AI assistant. Your name is {name}."),
        ("human", "Nice to meet you!"),
        ("ai", "Hello! How can I assist you?"),
        ("human", "{user_input}"),
    ]
)

# Create chat messages
messages = chat_template.format_messages(name="Teddy", user_input="What is your name?")
messages
```

```
[SystemMessage(content='You are a friendly AI assistant. Your name is Teddy.', additional_kwargs={}, response_metadata={}),
 HumanMessage(content='Nice to meet you!', additional_kwargs={}, response_metadata={}),
 AIMessage(content='Hello! How can I assist you?', additional_kwargs={}, response_metadata={}),
 HumanMessage(content='What is your name?', additional_kwargs={}, response_metadata={})]
```

A message list in this form can be passed directly to the LLM:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:5551/v1",
    api_key="EMPTY",
    model_name="Qwen2.5-7B-Instruct",
    temperature=0.2,
)

llm.invoke(messages).content
```

```
"My name is Teddy. It's nice to meet you! How can I help you today?"
```
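To make the mechanics concrete, here is a minimal pure-Python sketch of what `from_messages` plus `format_messages` do. It is a simplification under stated assumptions, not LangChain's actual message classes:

```python
def format_messages(template_messages, **kwargs):
    """Fill template variables per message, returning (role, content)
    pairs instead of one flat string -- a rough sketch, not the real API."""
    return [(role, content.format(**kwargs)) for role, content in template_messages]


chat_template = [
    ("system", "You are a friendly AI assistant. Your name is {name}."),
    ("human", "Nice to meet you!"),
    ("ai", "Hello! How can I assist you?"),
    ("human", "{user_input}"),
]

messages = format_messages(chat_template, name="Teddy", user_input="What is your name?")
```

The key point is that formatting happens per message, so each role keeps its own content; the real `format_messages` additionally wraps each pair in a typed message object (`SystemMessage`, `HumanMessage`, ...).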
### MessagesPlaceholder

LangChain provides `MessagesPlaceholder`, which is useful when:

- you are not yet sure which roles will appear in part of a message prompt template, and it gives you flexibility;
- you want to insert a list of messages at format time, e.g. for dynamic conversation history.

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

chat_prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a summarization specialist AI assistant. Your mission is to summarize conversations using key points.",
        ),
        MessagesPlaceholder(variable_name="conversation"),
        ("human", "Summarize the conversation so far in {word_count} words."),
    ]
)
chat_prompt
```

```
ChatPromptTemplate(input_variables=['conversation', 'word_count'], input_types={'conversation': list[...]}, partial_variables={}, messages=[SystemMessagePromptTemplate(prompt=PromptTemplate(input_variables=[], input_types={}, partial_variables={}, template='You are a summarization specialist AI assistant. Your mission is to summarize conversations using key points.'), additional_kwargs={}), MessagesPlaceholder(variable_name='conversation'), HumanMessagePromptTemplate(prompt=PromptTemplate(input_variables=['word_count'], input_types={}, partial_variables={}, template='Summarize the conversation so far in {word_count} words.'), additional_kwargs={})])
```

(The `input_types` annotation for `conversation` is abbreviated here; the real output enumerates every supported message type.)

```python
formatted_chat_prompt = chat_prompt.format(
    word_count=5,
    conversation=[
        ("human", "Hello! I'm Teddy. Nice to meet you."),
        ("ai", "Nice to meet you! I look forward to working with you."),
    ],
)

print(formatted_chat_prompt)
```

```
System: You are a summarization specialist AI assistant. Your mission is to summarize conversations using key points.
Human: Hello! I'm Teddy. Nice to meet you.
AI: Nice to meet you! I look forward to working with you.
Human: Summarize the conversation so far in 5 words.
```

### Few-Shot Prompting

LangChain's few-shot prompting provides a powerful framework for guiding language models to produce high-quality output by supplying carefully chosen examples. The technique reduces the need for extensive model fine-tuning while still delivering precise, context-aware results across applications.

- Few-shot prompt templates: define the structure and format of the prompt by embedding examples, guiding the model toward consistent output.
- Example selection strategies: dynamically select the examples most relevant to a given query, strengthening the model's contextual understanding and improving response accuracy.
- Chroma vector store: stores and retrieves examples by semantic similarity, providing a scalable, efficient basis for prompt construction.

### FewShotPromptTemplate

Few-shot prompting guides a model toward accurate, context-aware output with a small number of well-designed examples. LangChain's `FewShotPromptTemplate` streamlines the process, letting users build flexible, reusable prompts for tasks such as question answering, summarization, and text correction.

1. Designing few-shot prompts: define examples that demonstrate the desired output structure and style; make sure they cover edge cases to strengthen the model's understanding and performance.
2. Dynamic example selection: use semantic similarity or vector search to select the examples most relevant to the input query.
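A placeholder is essentially a marker that expands into a caller-supplied message list at format time. A minimal pure-Python sketch of that idea (the `Placeholder` class and function names are invented for illustration, not LangChain internals):

```python
class Placeholder:
    """Marks a slot to be filled with a list of (role, message) tuples."""

    def __init__(self, variable_name):
        self.variable_name = variable_name


ROLE_NAMES = {"system": "System", "human": "Human", "ai": "AI"}

def format_chat_prompt(template, **kwargs):
    lines = []
    for item in template:
        if isinstance(item, Placeholder):
            # Expand the placeholder into the supplied conversation history
            lines += [f"{ROLE_NAMES[r]}: {m}" for r, m in kwargs[item.variable_name]]
        else:
            role, tmpl = item
            lines.append(f"{ROLE_NAMES[role]}: {tmpl.format(**kwargs)}")
    return "\n".join(lines)


out = format_chat_prompt(
    [
        ("system", "You are a summarization specialist AI assistant."),
        Placeholder("conversation"),
        ("human", "Summarize the conversation so far in {word_count} words."),
    ],
    word_count=5,
    conversation=[("human", "Hello! I'm Teddy."), ("ai", "Nice to meet you!")],
)
print(out)
```

The fixed messages are formatted with the keyword arguments, while the placeholder is simply replaced by however many history messages the caller passes in.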
3. Integrating few-shot prompts: combine the prompt template with a language model to build a powerful chain that produces high-quality responses.

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:5551/v1",
    api_key="EMPTY",
    model_name="Qwen2.5-7B-Instruct",
    temperature=0.2,
)

# User query
question = "What is the capital of United States of America?"

# Query the model
response = llm.invoke(question)

# Print the response
print(response.content)
```

```
The capital of the United States of America is Washington, D.C.
```

Below is a chain-of-thought (CoT) example prompt:

```python
from langchain_core.prompts import PromptTemplate, FewShotPromptTemplate

# Define examples for the few-shot prompt
examples = [
    {
        "question": "Who lived longer, Steve Jobs or Einstein?",
        "answer": """Does this question require additional questions: Yes.
Additional Question: At what age did Steve Jobs die?
Intermediate Answer: Steve Jobs died at the age of 56.
Additional Question: At what age did Einstein die?
Intermediate Answer: Einstein died at the age of 76.
The final answer is: Einstein""",
    },
    {
        "question": "When was the founder of Naver born?",
        "answer": """Does this question require additional questions: Yes.
Additional Question: Who is the founder of Naver?
Intermediate Answer: Naver was founded by Lee Hae-jin.
Additional Question: When was Lee Hae-jin born?
Intermediate Answer: Lee Hae-jin was born on June 22, 1967.
The final answer is: June 22, 1967""",
    },
    {
        "question": "Who was the reigning king when Yulgok Yi's mother was born?",
        "answer": """Does this question require additional questions: Yes.
Additional Question: Who is Yulgok Yi's mother?
Intermediate Answer: Yulgok Yi's mother is Shin Saimdang.
Additional Question: When was Shin Saimdang born?
Intermediate Answer: Shin Saimdang was born in 1504.
Additional Question: Who was the king of Joseon in 1504?
Intermediate Answer: The king of Joseon in 1504 was Yeonsangun.
The final answer is: Yeonsangun""",
    },
    {
        "question": "Are the directors of Oldboy and Parasite from the same country?",
        "answer": """Does this question require additional questions: Yes.
Additional Question: Who is the director of Oldboy?
Intermediate Answer: The director of Oldboy is Park Chan-wook.
Additional Question: Which country is Park Chan-wook from?
Intermediate Answer: Park Chan-wook is from South Korea.
Additional Question: Who is the director of Parasite?
Intermediate Answer: The director of Parasite is Bong Joon-ho.
Additional Question: Which country is Bong Joon-ho from?
Intermediate Answer: Bong Joon-ho is from South Korea.
The final answer is: Yes""",
    },
]

example_prompt = PromptTemplate.from_template(
    "Question:\n{question}\nAnswer:\n{answer}"
)

# Print the first formatted example
print(example_prompt.format(**examples[0]))
```

```
Question:
Who lived longer, Steve Jobs or Einstein?
Answer:
Does this question require additional questions: Yes.
Additional Question: At what age did Steve Jobs die?
Intermediate Answer: Steve Jobs died at the age of 56.
Additional Question: At what age did Einstein die?
Intermediate Answer: Einstein died at the age of 76.
The final answer is: Einstein
```

The following `FewShotPromptTemplate` prepends the `examples`, each rendered in the `example_prompt` format, before the actual QA. The actual QA is rendered with the `suffix` format.

```python
# Initialize the FewShotPromptTemplate
few_shot_prompt = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Question:\n{question}\nAnswer:",
    input_variables=["question"],
)

# Example question
question = "How old was Bill Gates when Google was founded?"

# Generate the final prompt
final_prompt = few_shot_prompt.format(question=question)
print(final_prompt)
```

```
Question:
Who lived longer, Steve Jobs or Einstein?
Answer:
Does this question require additional questions: Yes.
Additional Question: At what age did Steve Jobs die?
Intermediate Answer: Steve Jobs died at the age of 56.
Additional Question: At what age did Einstein die?
Intermediate Answer: Einstein died at the age of 76.
The final answer is: Einstein

[... the remaining three examples are rendered in the same format ...]

Question:
How old was Bill Gates when Google was founded?
Answer:
```

```python
response = llm.invoke(final_prompt)
print(response.content)
```

```
Does this question require additional questions: Yes.
Additional Question: When was Google founded?
Intermediate Answer: Google was founded in 1998.
Additional Question: When was Bill Gates born?
Intermediate Answer: Bill Gates was born on October 28, 1955.
The final answer is: Bill Gates was 43 years old when Google was founded.
```
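The assembly that `FewShotPromptTemplate` performs can be sketched in a few lines of plain Python: render each example with the example template, then append the real question rendered with the suffix. This is a simplification with invented helper names, not the library's implementation:

```python
def build_few_shot_prompt(examples, example_template, suffix, **kwargs):
    """Render each example with example_template, then append the
    actual question rendered with suffix."""
    blocks = [example_template.format(**ex) for ex in examples]
    blocks.append(suffix.format(**kwargs))
    return "\n\n".join(blocks)


# Shortened examples for illustration only
examples = [
    {"question": "Who lived longer, Steve Jobs or Einstein?",
     "answer": "The final answer is: Einstein"},
    {"question": "When was the founder of Naver born?",
     "answer": "The final answer is: June 22, 1967"},
]

final_prompt = build_few_shot_prompt(
    examples,
    example_template="Question:\n{question}\nAnswer:\n{answer}",
    suffix="Question:\n{question}\nAnswer:",
    question="How old was Bill Gates when Google was founded?",
)
print(final_prompt)
```

Because the suffix ends with a bare `Answer:`, the model is nudged to continue in exactly the style the rendered examples establish.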
## Special prompts

### RAG document analysis

Answer questions from retrieved document context, ensuring high accuracy and relevance.

```python
from langchain.prompts import ChatPromptTemplate

system = """You are a precise and helpful AI assistant specializing in question-answering tasks based on provided context.
Your primary task is to:
1. Analyze the provided context thoroughly
2. Answer questions using ONLY the information from the context
3. Preserve technical terms and proper nouns exactly as they appear
4. If the answer cannot be found in the context, respond with: 'The provided context does not contain information to answer this question.'
5. Format responses in clear, readable paragraphs with relevant examples when available
6. Focus on accuracy and clarity in your responses
"""

human = """#Question:
{question}

#Context:
{context}

#Answer:
Please provide a focused, accurate response that directly addresses the question using only the information from the provided context."""

prompt = ChatPromptTemplate.from_messages([("system", system), ("human", human)])
prompt
```

```
ChatPromptTemplate(input_variables=['context', 'question'], input_types={}, partial_variables={}, messages=[SystemMessagePromptTemplate(...), HumanMessagePromptTemplate(...)])
```

(The repr is abbreviated; it contains the full system and human templates.)

### RAG with source attribution

An enhanced RAG implementation with detailed source tracking and citations, improving traceability and verifiability.

```python
from langchain import hub
from langchain.prompts import ChatPromptTemplate

system = """You are a precise and thorough AI assistant that provides well-documented answers with source attribution.
Your responsibilities include:
1. Analyzing provided context thoroughly
2. Generating accurate answers based solely on the given context
3. Including specific source references for each key point
4. Preserving technical terminology exactly as presented
5. Maintaining clear citation format [source: page/document]
6. If information is not found in the context, state: 'The provided context does not contain information to answer this question.'

Format your response as:
1. Main Answer
2. Sources Used (with specific locations)
3. Confidence Level (High/Medium/Low)"""

human = """#Question:
{question}

#Context:
{context}

#Answer:
Please provide a detailed response with source citations using only information from the provided context."""

prompt = ChatPromptTemplate.from_messages([("system", system), ("human", human)])

# Optionally push the prompt to LangChain Hub
# (prompt_title is not defined in this snippet; it comes from the original notebook)
PROMPT_OWNER = "eun"
hub.push(f"{PROMPT_OWNER}/{prompt_title}", prompt, new_repo_is_public=True)
```

In effect, the answer requirements now include source citations.

### LLM response evaluation

Comprehensively evaluate an LLM's response against multiple quality criteria, with a detailed scoring method.

```python
from langchain.prompts import PromptTemplate

# Note: the braces of the JSON schema are escaped as {{ }} so that
# PromptTemplate does not treat them as template variables.
evaluation_prompt = """Evaluate the LLM's response based on the following criteria:

INPUT:
Question: {question}
Context: {context}
LLM Response: {answer}

EVALUATION CRITERIA:
1. Accuracy (0-10)
- Perfect (10): Completely accurate, perfectly aligned with context
- Good (7-9): Minor inaccuracies
- Fair (4-6): Some significant inaccuracies
- Poor (0-3): Major inaccuracies or misalignment

2. Completeness (0-10)
- Perfect (10): Comprehensive coverage of all relevant points
- Good (7-9): Covers most important points
- Fair (4-6): Missing several key points
- Poor (0-3): Critically incomplete

3. Context Relevance (0-10)
- Perfect (10): Optimal use of context
- Good (7-9): Good use with minor omissions
- Fair (4-6): Partial use of relevant context
- Poor (0-3): Poor context utilization

4. Clarity (0-10)
- Perfect (10): Exceptionally clear and well-structured
- Good (7-9): Clear with minor issues
- Fair (4-6): Somewhat unclear
- Poor (0-3): Confusing or poorly structured

SCORING METHOD:
1. Calculate individual scores
2. Compute weighted average:
- Accuracy: 40%
- Completeness: 25%
- Context Relevance: 25%
- Clarity: 10%
3. Normalize to 0-1 scale

OUTPUT FORMAT:
{{
  "individual_scores": {{
    "accuracy": float,
    "completeness": float,
    "context_relevance": float,
    "clarity": float
  }},
  "weighted_score": float,
  "normalized_score": float,
  "evaluation_notes": string
}}

Return ONLY the normalized_score as a decimal between 0 and 1."""

prompt = PromptTemplate.from_template(evaluation_prompt)
```
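The scoring method the evaluation prompt describes is easy to check in plain Python. The weights come from the prompt above; the criterion scores below are hypothetical:

```python
# Weights from the evaluation prompt's SCORING METHOD section
WEIGHTS = {"accuracy": 0.40, "completeness": 0.25, "context_relevance": 0.25, "clarity": 0.10}

def normalized_score(scores):
    """Weighted average of the 0-10 criterion scores, normalized to 0-1."""
    weighted = sum(scores[name] * weight for name, weight in WEIGHTS.items())
    return weighted / 10

# Hypothetical scores an evaluator model might return
example_scores = {"accuracy": 9, "completeness": 8, "context_relevance": 7, "clarity": 10}
print(round(normalized_score(example_scores), 3))  # 0.835
```

Doing this arithmetic yourself is a useful sanity check on whatever `normalized_score` the evaluator model returns.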