feat:add ollama example #59

Closed · wants to merge 10 commits
42 changes: 42 additions & 0 deletions spring-ai-alibaba-examples/ollama-example/README.md
@@ -0,0 +1,42 @@
# How to deploy a large model locally, using Ollama as the example
## 1. Download Ollama
### Go to the official Ollama website: https://ollama.com/
Download the Ollama build that matches your operating system.
## 2. Install Ollama
### Double-click OllamaSetup.exe to install
> Note: when installing on Windows you cannot choose the install location; Ollama is installed to the system drive by default.

After the installation finishes, open a terminal and type `ollama` to verify that Ollama was installed successfully.
> If the installation succeeded, Ollama prints its version and usage information.
## 3. Set the model storage directory
### 3.1 On Windows, models installed by Ollama are stored under C:/Users/<username>/.ollama/models by default
The directory models are stored in can be changed with the following command:
> Sets it for the current user only (create the D:\ollama_models directory first):
>
> setx OLLAMA_MODELS "D:\ollama_models"
### 3.2 Restart the terminal
> When an environment variable is set with `setx` on Windows, the change only takes effect in newly opened command prompt or terminal sessions.
### 3.3 Restart the Ollama service
> Type `ollama` in the terminal.
## 4. Browse the models Ollama supports
> https://ollama.com/library
>
> Click a model link, for example llama2, to see its detailed description.
## 5. Install a model
> A model can be installed with the following command:
>
> ollama pull llama2

The download can be slow; please be patient.
## 6. List installed models
View the list of installed models with:
> ollama list
## 7. Run the model
> ollama run llama2

When the "Send a message" prompt appears, the deployment is ready and you can chat with the model.
> To exit the model:
> /bye
## Tips
The locally deployed model listens on port 11434 by default; the access URL is http://127.0.0.1:11434
#### External access can be allowed by changing an environment variable (for example, setting `OLLAMA_HOST` to `0.0.0.0` so the server listens on all interfaces)
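Before pointing the Spring application at that address, it helps to confirm the server is actually answering. The following is a hypothetical helper, not part of this PR, that calls the root endpoint on the default port with the JDK's built-in HTTP client; a healthy local install typically replies with "Ollama is running".

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical helper (not part of this PR): checks that the local Ollama
// server mentioned above is reachable on the default port 11434.
public class OllamaPing {

    public static void main(String[] args) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://127.0.0.1:11434/"))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // A healthy install typically answers HTTP 200 with "Ollama is running".
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```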
78 changes: 78 additions & 0 deletions spring-ai-alibaba-examples/ollama-example/pom.xml
@@ -0,0 +1,78 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.3.3</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.alibaba.cloud.ai</groupId>
    <artifactId>ollama-example</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>ollama-example</name>
    <description>Demo project for Spring AI Alibaba</description>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <maven.compiler.source>17</maven.compiler.source>
        <maven.compiler.target>17</maven.compiler.target>
        <maven-deploy-plugin.version>3.1.1</maven-deploy-plugin.version>

    </properties>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework.ai</groupId>
                <artifactId>spring-ai-bom</artifactId>
                <version>1.0.0-M2</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
            <version>1.0.0-M2</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-deploy-plugin</artifactId>
                <version>${maven-deploy-plugin.version}</version>
                <configuration>
                    <skip>true</skip>
                </configuration>
            </plugin>
        </plugins>
    </build>

    <repositories>
        <repository>
            <id>spring-milestones</id>
            <name>Spring Milestones</name>
            <url>https://repo.spring.io/milestone</url>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </repository>
    </repositories>

</project>
@@ -0,0 +1,24 @@
package com.alibaba.cloud.ai.example.ollama;

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;


@RestController
@RequestMapping("/ai")
public class OllamaController {
    private final ChatModel chatModel;
    public OllamaController(ChatModel chatModel) {
        this.chatModel = chatModel;
    }
    @GetMapping("/chat")
    public String chat(String input) {
        ChatResponse response = chatModel.call(new Prompt(input));
        return response.getResult().getOutput().getContent();
    }

}
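With the application started on port 8090 (see application.yml further down), the endpoint above can be exercised from any HTTP client. Here is a minimal client sketch using the JDK's HttpClient, assuming the defaults shipped in this PR (localhost, port 8090, request parameter `input`):

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Minimal client sketch for the /ai/chat endpoint above; assumes the example
// application is running locally on the port configured in application.yml.
public class OllamaChatClientExample {

    public static void main(String[] args) throws Exception {
        String question = URLEncoder.encode("Hello, who are you?", StandardCharsets.UTF_8);
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8090/ai/chat?input=" + question))
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // the model's reply as plain text
    }
}
```

The same call can be made from a browser or curl; the controller returns the model's reply as a plain string.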
@@ -0,0 +1,13 @@
package com.alibaba.cloud.ai.example.ollama;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class OllamaExampleApplication {

    public static void main(String[] args) {
        SpringApplication.run(OllamaExampleApplication.class, args);
    }

}
@@ -0,0 +1,12 @@
server:
  port: 8090
spring:
  application:
    name: ollama-example
  ai:
    ollama:
      base-url: http://127.0.0.1:11434
      chat:
        options:
          model: llama2:latest
          temperature: 0.4F
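The options above pin the model to llama2:latest with a temperature of 0.4. If you want a broken Ollama connection or a missing model to surface immediately at startup, a smoke-test bean along these lines could be added; it is hypothetical, not part of this PR, and uses only the ChatModel API already shown in OllamaController:

```java
package com.alibaba.cloud.ai.example.ollama;

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.boot.CommandLineRunner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical startup smoke test (not part of this PR): issues one call to the
// configured model so connection problems are reported as soon as the app starts.
@Configuration
public class OllamaSmokeTestConfig {

    @Bean
    CommandLineRunner ollamaSmokeTest(ChatModel chatModel) {
        return args -> {
            String answer = chatModel.call(new Prompt("Say hello in one short sentence."))
                    .getResult()
                    .getOutput()
                    .getContent();
            System.out.println("Ollama smoke test reply: " + answer);
        };
    }
}
```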
1 change: 1 addition & 0 deletions spring-ai-alibaba-examples/pom.xml
@@ -41,6 +41,7 @@
        <module>rag-example</module>
        <module>output-parser-example</module>
        <module>playground-flight-booking</module>
        <module>ollama-example</module>
    </modules>

    <build>