DAG Orchestration Plan
This example shows how to build a chat application with a large language model using AWEL DAG orchestration. Its core logic is to construct the model request parameters from the user's input and then converse with the model, so the orchestration consists of the following steps (each step maps onto one operator, as summarized after the list):
- Receive the HTTP request
- Process the request content
- Run LLM inference
- Parse the model output
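These steps correspond to the operators used in the rest of this example:

- Receive the HTTP request: `HttpTrigger`
- Process the request content: `RequestHandleOperator` (a custom `MapOperator`)
- Run LLM inference: `LLMOperator`
- Parse the model output: `MapOperator(lambda out: out.to_dict())`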

As before, `MapOperator` and `LLMOperator` are built-in DB-GPT operators, so they can be imported and used directly.
```python
from dbgpt._private.pydantic import BaseModel, Field
from dbgpt.core import ModelMessage, ModelRequest
from dbgpt.core.awel import DAG, HttpTrigger, MapOperator
from dbgpt.model.operator import LLMOperator
```
Custom Operator
As before, we need a custom operator that processes the user request and builds the model input parameters. First, define the request body, which carries two fields: the model name `model` and the user input `user_input`.
```python
class TriggerReqBody(BaseModel):
    model: str = Field(..., description="Model name")
    user_input: str = Field(..., description="User input")
```
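As a minimal sketch (illustration only, not part of the example code, assuming standard pydantic behavior), the JSON body later sent to the HTTP trigger is validated into a `TriggerReqBody` instance:

```python
# Hypothetical quick check: construct the request body the same way the HTTP
# trigger will, from the "model" and "user_input" fields of the JSON payload.
body = TriggerReqBody(model="proxyllm", user_input="hello")
print(body.model, body.user_input)
```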
Next, build the model inference parameters from the request. Define a custom operator, `RequestHandleOperator`, which inherits from `MapOperator`; overriding its `map` method is enough to implement the parameter construction.
```python
class RequestHandleOperator(MapOperator[TriggerReqBody, ModelRequest]):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)

    async def map(self, input_value: TriggerReqBody) -> ModelRequest:
        messages = [ModelMessage.build_human_message(input_value.user_input)]
        print(f"Receive input value: {input_value}")
        return ModelRequest.build_request(input_value.model, messages)
```
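To see what this operator produces, here is a minimal sketch (illustration only, not part of the tutorial code, assuming the `map` method can be awaited directly for a quick local check):

```python
import asyncio


async def preview_request_handle():
    # Construct the operator inside a throwaway DAG, mirroring the
    # orchestration below, then await its map() directly to inspect
    # the ModelRequest it builds.
    with DAG("request_handle_preview"):
        operator = RequestHandleOperator()
    request = await operator.map(
        TriggerReqBody(model="proxyllm", user_input="hello")
    )
    print(request)


asyncio.run(preview_request_handle())
```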
DAG Orchestration
With the operators written, the next step is to wire them together into an AWEL DAG. The `>>` operator connects the output of the task on its left to the input of the task on its right, so the whole pipeline is expressed as `trigger >> request_handle_task >> llm_task >> model_parse_task`.
```python
with DAG("dbgpt_awel_simple_dag_example") as dag:
    # Receive http request and trigger dag to run.
    trigger = HttpTrigger(
        "/examples/simple_chat", methods="POST", request_body=TriggerReqBody
    )
    request_handle_task = RequestHandleOperator()
    llm_task = LLMOperator(task_name="llm_task")
    model_parse_task = MapOperator(lambda out: out.to_dict())
    trigger >> request_handle_task >> llm_task >> model_parse_task
```
Testing and Verification
As in the previous tutorials, there are two ways to verify this example.
- Start `dbgpt_server`

```bash
python dbgpt/app/dbgpt_server.py
```

- Start from the development environment
```python
if __name__ == "__main__":
    if dag.leaf_nodes[0].dev_mode:
        # Development mode, you can run the dag locally for debugging.
        from dbgpt.core.awel import setup_dev_environment

        setup_dev_environment([dag], port=5555)
    else:
        # Production mode, DB-GPT will automatically load and execute the
        # current file after startup.
        pass
```
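In development mode, `setup_dev_environment` serves the DAG's trigger locally on the given port (5555 here), so the request below targets `http://127.0.0.1:5555`. When testing against a running `dbgpt_server` instead, point the request at that server's port.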
Send a test request with curl:

```bash
DBGPT_SERVER="http://127.0.0.1:5555"
curl -X POST $DBGPT_SERVER/api/v1/awel/trigger/examples/simple_chat \
    -H "Content-Type: application/json" \
    -d '{"model": "proxyllm", "user_input": "hello"}'
```

Example response:

```json
{
  "text": "Hello! How can I assist you today?",
  "error_code": 0,
  "model_context": {
    "prompt_echo_len_char": -1,
    "has_format_prompt": false
  },
  "finish_reason": null,
  "usage": null,
  "metrics": {
    "collect_index": 10,
    "start_time_ms": 1704436285946,
    "end_time_ms": 1704436293358,
    "current_time_ms": 1704436293358,
    "first_token_time_ms": null,
    "first_completion_time_ms": 1704436292905,
    "first_completion_tokens": null,
    "prompt_tokens": null,
    "completion_tokens": null,
    "total_tokens": null,
    "speed_per_second": null,
    "current_gpu_infos": null,
    "avg_gpu_infos": null
  }
}
```
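The fields in the response (`text`, `error_code`, `model_context`, `metrics`, and so on) are the dictionary produced by `model_parse_task`, which calls `to_dict()` on the model output.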
:::danger
⚠️ Note: the port used in the test request must match the port the service was started on.
:::
