1. Framework introduction:

    Solon (OpenSolon) | official site (a Spring alternative)

   Ollama

   AnythingLLM | The all-in-one AI application for everyone

2. Requirements:

   1. Use a local large model deployed with Ollama for Q&A and knowledge retrieval.

   2. Use the large model for decision-making.

3. Sample code:

  Note: this article uses version 3.1.0-SNAPSHOT; once the stable release is published, some adjustments may be needed.

3.1 chat and stream-chat with the large model

       pom file:

        

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <parent>
        <groupId>org.noear</groupId>
        <artifactId>solon-parent</artifactId>
        <version>3.1.0-SNAPSHOT</version>
        <relativePath />
    </parent>

    <groupId>com.wht</groupId>
    <artifactId>solon-ai-examples</artifactId>
    <version>1.0</version>
    
    <packaging>jar</packaging>

    <description>Demo project for Solon</description>

    <properties>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.noear</groupId>
            <artifactId>solon-lib</artifactId>
        </dependency>
        
        <dependency>
            <groupId>org.noear</groupId>
            <artifactId>solon-logging-logback</artifactId>
        </dependency>
        
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>org.noear</groupId>
            <artifactId>solon-test</artifactId>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>cn.hutool</groupId>
            <artifactId>hutool-all</artifactId>
            <version>5.8.31</version>
        </dependency>

        <dependency>
            <groupId>org.noear</groupId>
            <artifactId>solon-ai</artifactId>
            <version>3.1.0-SNAPSHOT</version>
        </dependency>

    </dependencies>

    <build>
        <finalName>${project.artifactId}</finalName>

        <plugins>
            <plugin>
                <groupId>org.noear</groupId>
                <artifactId>solon-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
    
    <repositories>
        <repository>
            <id>tencent</id>
            <url>https://mirrors.cloud.tencent.com/nexus/repository/maven-public/</url>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </repository>
    </repositories>

</project>

app.yml:

# llm configuration
solon.ai:
  ollama:
    apiUrl: "http://127.0.0.1:11434/api/chat"
    provider: "ollama"
    model: "qwen2.5:14b"

LLMConfig:

@Configuration
public class LLMConfig {

    @Bean
    public ChatModel build(@Inject("${solon.ai.ollama}") ChatConfig config) {
        return ChatModel.of(config).build();
    }
}

Code:

package com.wht.ai;

import lombok.SneakyThrows;
import org.junit.jupiter.api.Test;
import org.noear.solon.ai.chat.ChatModel;
import org.noear.solon.ai.chat.ChatResponse;
import org.noear.solon.annotation.Inject;
import org.noear.solon.rx.SimpleSubscriber;
import org.noear.solon.test.HttpTester;
import org.noear.solon.test.SolonTest;
import org.reactivestreams.Publisher;

import java.util.concurrent.CountDownLatch;


@SolonTest(value = App.class)
public class ChatTest extends HttpTester {

    @Inject
    ChatModel chatModel;

    @SneakyThrows
    @Test
    public void chat() {

        ChatResponse response = chatModel.prompt("你是谁").call();

        System.err.println(response.getMessage());
    }

    @SneakyThrows
    @Test
    public void chatStream() {

        Publisher<ChatResponse> publisher = chatModel.prompt("你是谁").stream();

        CountDownLatch latch = new CountDownLatch(1);

        publisher.subscribe(new SimpleSubscriber<ChatResponse>()
                .doOnNext(resp -> System.err.println(resp.getMessage()))
                .doOnComplete(() -> {
                    System.err.println("complete");
                    //release the latch, otherwise await() below blocks forever
                    latch.countDown();
                })
                .doOnError(err -> {
                    System.err.println("error: " + err.getMessage());
                    latch.countDown();
                })
        );

        latch.await();
    }


}
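The chatStream() test above shows the standard pattern for consuming a streaming response: subscribe, append each chunk, and block on a CountDownLatch until completion. The same pattern can be sketched with the JDK's built-in Flow API, with no Solon types involved (note: Flow requires Java 9+, while the pom above targets Java 8; the class name StreamPatternDemo and the simulated chunks are hypothetical):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class StreamPatternDemo {

    // Assemble chunks into a full answer, blocking on a CountDownLatch
    // until the publisher completes -- the same pattern as chatStream() above.
    public static String collect(String[] chunks) throws InterruptedException {
        CountDownLatch latch = new CountDownLatch(1);
        StringBuilder answer = new StringBuilder();

        try (SubmissionPublisher<String> publisher = new SubmissionPublisher<>()) {
            publisher.subscribe(new Flow.Subscriber<String>() {
                public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
                public void onNext(String chunk) { answer.append(chunk); }
                public void onError(Throwable t) { latch.countDown(); }
                public void onComplete() { latch.countDown(); }
            });
            for (String c : chunks) {
                publisher.submit(c);
            }
        } // close() signals onComplete to the subscriber

        latch.await();
        return answer.toString();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(collect(new String[]{"Hello", ", ", "world"}));
    }
}
```

Without the countDown() in onComplete/onError, await() would block forever once the stream ends, which is why the test above releases the latch in both callbacks.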

3.2 Decision-making with the large model

        Scenario: a customer dining at a restaurant

        Requirement: recommend dishes, answer dish prices, let the customer place an order (decision-making), and make small talk with the customer (chat)

        Logic: 1. When the customer asks about dishes or their prices, query the database for the dishes and prices

                  2. When the customer orders, generate the order information automatically

                  3. Use the base large model for small talk (optionally combined with an AnythingLLM knowledge base)

       function call: whether a model supports it must be checked in the provider's documentation (deepseek-r1 does not support it)

       Reference:

     How to use the Function Calling feature - Alibaba Cloud Model Studio help center

     Sample code:

    

@Component
public class MenuFunction {

    @FunctionMapping(name = "menuInfo", description = "介绍菜品信息")
    public String menuInfo() {

        //A database query could go here to return the dish information
        return StrUtil.format("顾客,您好!\n" +
                "我们店的菜品非常丰富:\n" +
                "荤菜有:香辣鸡杂、毛豆烧鸡、红烧肉...\n" +
                "素菜有:蒸蛋、西红柿鸡蛋、油炸大白菜....");
    }

    @FunctionMapping(name = "getPrice", description = "查询菜品的价格")
    public String getPrice(@FunctionParam(description = "根据问题推测菜品名") String name) {

        //Look up the price for the dish name in the database here
        double price = 9.9;
        return StrUtil.format("{}的价格是:{}元", name, price);
    }

    @FunctionMapping(name = "order", description = "下单")
    public String order(@FunctionParam(description = "根据问题推测所有的菜品名,用“,”隔开,不要出现其他的字符") String order) {

        //An order could be generated automatically from the dish names here
        System.err.println("function order, order = " + order);
        return StrUtil.format("您下单的菜品是:{}", order);
    }

}
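When the model decides a tool is needed, the function-calling runtime dispatches by the name declared in @FunctionMapping. This is not Solon's actual implementation, just a minimal hand-rolled sketch of the lookup-and-invoke idea (the class name ToolDispatcher and the single-string-argument shape are assumptions for illustration):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class ToolDispatcher {

    private final Map<String, Function<String, String>> tools = new HashMap<>();

    // Register a tool under the name the model will use,
    // mirroring @FunctionMapping(name = "...") above.
    public void register(String name, Function<String, String> fn) {
        tools.put(name, fn);
    }

    // Dispatch a tool call: look up the function by the name the model
    // returned and pass along its (single string) argument.
    public String dispatch(String name, String arg) {
        Function<String, String> fn = tools.get(name);
        return fn == null ? "unknown tool: " + name : fn.apply(arg);
    }

    public static void main(String[] args) {
        ToolDispatcher d = new ToolDispatcher();
        d.register("getPrice", dish -> dish + " costs 9.9 yuan");
        d.register("order", items -> "ordered: " + items);
        System.out.println(d.dispatch("getPrice", "braised pork")); // prints "braised pork costs 9.9 yuan"
    }
}
```

The description strings on the annotations matter just as much as the names: they are what the model reads when deciding which tool to call and how to fill its parameters.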

    anythingLLM dialect: a minimal example; other parameters are not wrapped

public class AnythingLLMDialect extends AbstractDialect {

    @Override
    public boolean matched(ChatConfig config) {
        return "anythingLLM".equals(config.provider());
    }

    @Override
    public String buildRequestJson(ChatConfig config, ChatOptions options, List<ChatMessage> messages, boolean stream) {

        return JSONUtil.createObj().set("message", messages.get(0).getContent()).set("mode", "chat").toString();
    }

    @Override
    public boolean parseResponseJson(ChatConfig config, ChatResponseAmend resp, String json) {

        json = json.replaceFirst("data:", "");

        JSONObject parseObj = JSONUtil.parseObj(json);

        resp.setFinished(parseObj.getBool("close"));
        resp.addChoice(new ChatChoice(0, new Date(), parseObj.getStr("close"),
                new AssistantMessage(parseObj.getStr("textResponse"), null, null, null)));

        return true;
    }
}
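AnythingLLM streams server-sent-event lines prefixed with "data:", which is why parseResponseJson() strips that prefix before parsing. A stdlib-only sketch of the stripping plus a crude field extraction, assuming a flat payload shape like {"textResponse": "...", "close": true} (the class name SsePayload is hypothetical; a real implementation should use a JSON parser, as the dialect above does with hutool):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SsePayload {

    // Strip the SSE "data:" prefix, as parseResponseJson() does above,
    // tolerating optional whitespace around it.
    public static String stripDataPrefix(String line) {
        String s = line.trim();
        return s.startsWith("data:") ? s.substring("data:".length()).trim() : s;
    }

    // Crude extraction of a string field from a flat JSON object.
    // Good enough for a sketch; it does not handle escaped quotes or nesting.
    public static String extractString(String json, String field) {
        Matcher m = Pattern.compile("\"" + field + "\"\\s*:\\s*\"([^\"]*)\"").matcher(json);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String line = "data: {\"textResponse\":\"Hi there\",\"close\":true}";
        String json = stripDataPrefix(line);
        System.out.println(extractString(json, "textResponse")); // prints "Hi there"
    }
}
```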

      Test note: if GPU memory is insufficient, a 3B or 7B model can also be used

public class FunctionTest {

    @SneakyThrows
    public static void main(String[] args) {

        //Ollama qwen2.5 model
        ChatModel chatModel = ChatModel.of("http://127.0.0.1:11434/api/chat")
                .provider("ollama")
                .model("qwen2.5:14b")
                //register the tool (function) methods
                .globalFunctionAdd(new MenuFunction())
                .build();

        //register the anythingLLM dialect
        ChatDialectManager.register(new AnythingLLMDialect());

        //build the anythingLLM ChatModel
        ChatModel anythingModel = ChatModel.of("http://localhost:3001/api/v1/workspace/solon-test/chat")
                .headerSet("Authorization", "Bearer TY5KDCS-3EQ47FB-G1NAD4N-W4YXTD6")
                .provider("anythingLLM")
                .build();

        //the decision prompt: the key to model-driven routing
        String text = ",这个问题的答案是否需要调用function tool?请回答“需要”或者“不需要”,不要回答其他额外的文字";

        System.err.println("顾客您好! 请问您有什么需要?");
        Scanner scanner = new Scanner(System.in);
        String userInput = scanner.nextLine();
        while (StrUtil.isNotEmpty(userInput)) {

            //check for quit before calling any model
            if ("q".equals(userInput)) {
                break;
            }

            String content = JSONUtil.createObj()
                    .set("input", JSONUtil.createObj().set("prompt", userInput).toString())
                    .toString();

            //let the model decide whether a function tool is needed
            ChatResponse decision = chatModel.prompt(ChatMessage.ofUser(content + text)).call();

            //if a function tool is needed, answer via the function-calling model
            if ("需要".equals(decision.getMessage().getContent())) {
                ChatResponse response = chatModel.prompt(userInput).call();
                System.err.println("function answer:" + response.getMessage().getContent());
                //wait for the next user input
                userInput = scanner.nextLine();
                continue;
            }

            //otherwise, answer from the knowledge base
            ChatResponse responseAnything = anythingModel.prompt(userInput).call();
            System.err.println("anything answer:" + responseAnything.getMessage().getContent());

            //wait for the next user input
            userInput = scanner.nextLine();

        }

        System.err.println("对话结束!");
    }
}
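The routing decision inside the loop can be isolated as a pure function, which makes the "need a tool?" check easy to unit-test without any model running. The prompt above instructs the model to answer exactly "需要" (needed) or "不需要" (not needed); the class name DecisionRouter is hypothetical:

```java
public class DecisionRouter {

    enum Route { FUNCTION_MODEL, KNOWLEDGE_BASE }

    // Map the decision model's raw answer to a route. Anything other than an
    // exact "需要" falls through to the knowledge base, the safer default,
    // since models sometimes add stray whitespace or extra words.
    public static Route route(String decisionAnswer) {
        String answer = decisionAnswer == null ? "" : decisionAnswer.trim();
        return "需要".equals(answer) ? Route.FUNCTION_MODEL : Route.KNOWLEDGE_BASE;
    }

    public static void main(String[] args) {
        System.out.println(route("需要"));    // FUNCTION_MODEL
        System.out.println(route("不需要"));  // KNOWLEDGE_BASE
        System.out.println(route(null));      // KNOWLEDGE_BASE
    }
}
```

Trimming before comparison is a small hardening over the exact equals() used in the loop above: stream-style models often emit a trailing newline that would otherwise silently route every question to the knowledge base.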

      Test results:

Complete sample code:

solon-ai-examples: simple usage examples for the solon-ai module, using function call to implement model-driven decision-making
