Building a Streaming AI Chat UI with Vue 3: A Hands-On Guide to Integrating the Deepseek/OpenAI APIs
Summary: This article walks through building a Deepseek/ChatGPT-style streaming chat interface with Vue 3 and wiring it up to the Deepseek/OpenAI APIs, covering UI design, streaming-response handling, and the full API-call implementation.
1. Technology Selection and Project Initialization
1.1 Choosing the Stack
Vue 3 is the core front-end framework: its Composition API and TypeScript support give complex interactions a solid foundation. Vite as the build tool keeps development fast, and Pinia as the state-management library simplifies cross-component communication. On the API side, the integration needs to support SSE (Server-Sent Events), the key technique behind streaming responses.
1.2 Project Initialization
npm create vue@latest ai-chat-demo   # select TypeScript when prompted
cd ai-chat-demo
npm install pinia axios @vueuse/core
A modular project layout is recommended:
src/
├── api/          # API request wrappers
├── components/   # shared components
├── composables/  # composition functions
├── router/       # route configuration
├── stores/       # Pinia stores
└── views/        # page-level components
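Since Pinia is installed as a separate dependency, remember to register it in the application entry. A minimal src/main.ts might look like this (the router is omitted here for brevity):
// src/main.ts
import { createApp } from 'vue';
import { createPinia } from 'pinia';
import App from './App.vue';

const app = createApp(App);
app.use(createPinia()); // enables Pinia stores such as the chat store defined later
app.mount('#app');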
2. Building the Streaming Chat UI
2.1 Core Component Design
The message-bubble component renders messages on both sides of the conversation; the content, sender, and streaming state are passed in via props:
<!-- MessageBubble.vue -->
<template>
  <div :class="['bubble', { 'user': isUser }]">
    <div v-if="!isStreaming" class="content">{{ content }}</div>
    <div v-else class="streaming">
      <span v-for="i in 5" :key="i" class="dot"></span>
    </div>
  </div>
</template>
<script setup lang="ts">
// content: message text; isUser: sent by the user; isStreaming: reply still being generated
withDefaults(defineProps<{
  content: string;
  isUser?: boolean;
  isStreaming?: boolean;
}>(), { isUser: false, isStreaming: false });
</script>
<style scoped>
.bubble {
max-width: 70%;
margin: 8px;
padding: 12px;
border-radius: 18px;
}
.user {
margin-left: auto;
background: #4a6bff;
color: white;
}
.streaming .dot {
display: inline-block;
width: 8px;
height: 8px;
border-radius: 50%;
background: #999; /* visible against the assistant bubble's light background */
animation: bounce 1.4s infinite ease-in-out;
}
/* bounce keyframes omitted for brevity */
</style>
2.2 Message-Stream Handling
The SSE connection uses EventSource. Because EventSource can only issue GET requests and cannot set custom headers, this assumes a same-origin back-end proxy that relays the provider's stream; dropped connections and retries also have to be handled (a retry wrapper sketch follows the example):
// api/chat.ts
// Assumes a back-end proxy at /api/chat/stream that relays the provider's SSE stream
export const streamChat = async (prompt: string) => {
const eventSource = new EventSource(`/api/chat/stream?prompt=${encodeURIComponent(prompt)}`);
return new Promise<string[]>((resolve, reject) => {
const chunks: string[] = [];
eventSource.onmessage = (event) => {
const data = JSON.parse(event.data);
if (data.finish_reason) {
eventSource.close();
resolve(chunks);
} else {
chunks.push(data.text);
}
};
eventSource.onerror = (error) => {
eventSource.close();
reject(error);
};
});
};
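The basic version above gives up after the first error. A minimal retry wrapper could look like the sketch below (the streamChatWithRetry name, the 3-attempt limit, and the linear back-off are illustrative choices, not part of the original code):
// api/chat.ts (continued) - possible retry wrapper, shown as a sketch
export const streamChatWithRetry = async (
  prompt: string,
  maxAttempts = 3
): Promise<string[]> => {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await streamChat(prompt); // reuse the EventSource-based function above
    } catch (error) {
      lastError = error;
      // simple linear back-off before the next attempt: 1s, 2s, 3s
      await new Promise((resolve) => setTimeout(resolve, attempt * 1000));
    }
  }
  throw lastError;
};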
3. Integrating the Deepseek/OpenAI APIs
3.1 Wrapping API Requests
Create a unified API service layer to handle authentication and errors (a retry sketch follows the example below):
// api/service.ts
import axios, { type AxiosRequestConfig } from 'axios';
const apiClient = axios.create({
baseURL: import.meta.env.VITE_API_BASE_URL,
timeout: 30000,
});
apiClient.interceptors.request.use((config) => {
const token = localStorage.getItem('api_token');
if (token) {
config.headers.Authorization = `Bearer ${token}`;
}
return config;
});
export const callApi = async <T>(url: string, config?: AxiosRequestConfig) => {
try {
const response = await apiClient.get<T>(url, config);
return response.data;
} catch (error) {
if (axios.isAxiosError(error) && error.response?.status === 401) {
// handle the authentication error (e.g. clear the token and redirect to login)
}
throw error;
}
};
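callApi itself simply rethrows failures; retries can be layered on with a response interceptor. A possible sketch (the 3-attempt limit, the back-off timing, and the _retryCount marker are assumptions, not an axios built-in):
// api/service.ts (continued) - retry interceptor, shown as a sketch
// Retries requests up to 3 times on 429 / 5xx responses with exponential back-off.
apiClient.interceptors.response.use(undefined, async (error) => {
  if (!axios.isAxiosError(error) || !error.config) {
    return Promise.reject(error);
  }
  const config = error.config as AxiosRequestConfig & { _retryCount?: number };
  const status = error.response?.status;
  const retriable = status === 429 || (status !== undefined && status >= 500);

  config._retryCount = (config._retryCount ?? 0) + 1;
  if (!retriable || config._retryCount > 3) {
    return Promise.reject(error);
  }
  // back-off: 1s, 2s, 4s
  await new Promise((resolve) => setTimeout(resolve, 2 ** (config._retryCount - 1) * 1000));
  return apiClient.request(config);
});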
3.2 Handling Streaming Responses
Deepseek's streaming format is OpenAI-compatible: chat chunks carry the text in choices[0].delta.content. Defensive field access is still advisable, since the legacy completions endpoints use choices[0].text instead:
// Handling a Deepseek chunk
const processDeepseekStream = (event: MessageEvent) => {
  const data = JSON.parse(event.data);
  // chat chunks use delta.content; fall back to the legacy text field
  const text = data.choices?.[0]?.delta?.content || data.choices?.[0]?.text;
  if (text) appendMessage(text);
};
// Handling an OpenAI chunk
const processOpenAIStream = (event: MessageEvent) => {
  const data = JSON.parse(event.data);
  const delta = data.choices?.[0]?.delta?.content;
  if (delta) appendMessage(delta);
};
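For reference, each SSE frame from an OpenAI-compatible streaming endpoint arrives as a data: line, and the stream ends with a literal data: [DONE] sentinel that is not JSON. A small defensive parser might look like this (appendMessage is the same helper assumed in the handlers above):
// One frame looks roughly like:
//   data: {"object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}
// and the final frame is:
//   data: [DONE]
const handleRawLine = (line: string) => {
  if (!line.startsWith('data:')) return;
  const payload = line.slice(5).trim();
  if (payload === '[DONE]') return; // end-of-stream sentinel, skip JSON.parse
  const data = JSON.parse(payload);
  const delta = data.choices?.[0]?.delta?.content;
  if (delta) appendMessage(delta);
};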
4. Performance Optimization and User Experience
4.1 Virtual Scrolling
Once the history grows past roughly 50 messages, virtual scrolling keeps rendering fast:
<!-- ChatContainer.vue -->
<template>
  <div ref="container" class="chat-container" @scroll="onScroll">
    <div :style="{ height: totalHeight + 'px' }" class="scroll-wrapper">
      <div :style="{ transform: `translateY(${offset}px)` }" class="content">
        <MessageBubble
          v-for="(msg, index) in visibleMessages"
          :key="startIndex + index"
          :content="msg.content"
          :is-user="msg.isUser"
        />
      </div>
    </div>
  </div>
</template>
<script setup lang="ts">
import { ref, computed } from 'vue';
import MessageBubble from './MessageBubble.vue';

const container = ref<HTMLElement>();
const itemHeight = 72;   // estimated average bubble height in px
const visibleCount = 20; // number of messages actually rendered
const scrollTop = ref(0);
const messages = ref<Array<{ content: string; isUser: boolean }>>([]);

const totalHeight = computed(() => messages.value.length * itemHeight);
const startIndex = computed(() => Math.floor(scrollTop.value / itemHeight));
const offset = computed(() => startIndex.value * itemHeight);
const visibleMessages = computed(() =>
  messages.value.slice(startIndex.value, startIndex.value + visibleCount)
);

const onScroll = () => {
  scrollTop.value = container.value?.scrollTop ?? 0;
};
</script>
4.2 Error Handling
A three-level error-handling scheme is used:
- Network layer: retry mechanism (up to 3 attempts; see the interceptor sketch in section 3.1)
- API layer: unified handling of error codes
- UI layer: user-friendly error messages
// composables/useError.ts
import { ref } from 'vue';
import axios from 'axios';

export const useError = () => {
  const error = ref<string | null>(null);
  const handleApiError = (e: unknown) => {
    if (axios.isAxiosError(e)) {
      switch (e.response?.status) {
        case 401: error.value = 'Please log in again'; break;
        case 429: error.value = 'Too many requests, please try again later'; break;
        default: error.value = 'The service is temporarily unavailable';
      }
    } else {
      error.value = 'Network connection error';
    }
    // auto-dismiss the message after 3 seconds
    setTimeout(() => (error.value = null), 3000);
  };
  return { error, handleApiError };
};
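One way to consume the composable in a component, wrapping the streamChat call from section 2.2 (the ask helper is illustrative):
// Example usage inside a component's <script setup lang="ts"> block
import { useError } from '@/composables/useError';
import { streamChat } from '@/api/chat';

const { error, handleApiError } = useError();

const ask = async (prompt: string) => {
  try {
    const chunks = await streamChat(prompt);
    return chunks.join('');
  } catch (e) {
    handleApiError(e); // sets `error`, which the template can render as a toast
    return '';
  }
};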
5. Deployment and Extension Suggestions
5.1 Environment Variables
# .env.production
VITE_API_BASE_URL=https://api.deepseek.com/v1
VITE_API_TYPE=deepseek # or openai
VITE_API_KEY=your_api_key
Note that VITE_-prefixed variables are inlined into the client bundle at build time, so a real production key should not be shipped this way; keep the key on a back-end proxy and expose only the proxy URL to the browser.
5.2 Supporting Multiple Models
The strategy pattern keeps provider-specific API code swappable:
// api/modelStrategy.ts
interface ChatModel {
  streamChat(prompt: string): Promise<string[]>;
}

class DeepseekModel implements ChatModel {
  async streamChat(prompt: string): Promise<string[]> {
    // Deepseek-specific implementation (a possible sketch follows below)
    throw new Error('Not implemented');
  }
}

class OpenAIModel implements ChatModel {
  async streamChat(prompt: string): Promise<string[]> {
    // OpenAI-specific implementation
    throw new Error('Not implemented');
  }
}

export const createModel = (type: string): ChatModel => {
  switch (type) {
    case 'deepseek': return new DeepseekModel();
    case 'openai': return new OpenAIModel();
    default: throw new Error(`Unsupported model type: ${type}`);
  }
};
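To fill in the Deepseek branch, the stub above could be replaced with a fetch-based implementation that calls the OpenAI-compatible chat-completions endpoint and parses the SSE stream itself. This is a sketch: the /chat/completions path, the deepseek-chat model name, and reading the key from VITE_API_KEY are assumptions to adapt to your own setup (and in production the key belongs behind a back-end proxy, as noted in 5.1):
// api/modelStrategy.ts - possible DeepseekModel implementation replacing the stub above (sketch)
class DeepseekModel implements ChatModel {
  async streamChat(prompt: string): Promise<string[]> {
    const response = await fetch(`${import.meta.env.VITE_API_BASE_URL}/chat/completions`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${import.meta.env.VITE_API_KEY}`, // demo only; prefer a proxy
      },
      body: JSON.stringify({
        model: 'deepseek-chat',
        messages: [{ role: 'user', content: prompt }],
        stream: true,
      }),
    });
    if (!response.ok || !response.body) {
      throw new Error(`Request failed: ${response.status}`);
    }

    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    const chunks: string[] = [];
    let buffer = '';

    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      buffer += decoder.decode(value, { stream: true });
      const lines = buffer.split('\n');
      buffer = lines.pop() ?? ''; // keep a partial trailing line in the buffer
      for (const line of lines) {
        const trimmed = line.trim();
        if (!trimmed.startsWith('data:')) continue;
        const payload = trimmed.slice(5).trim();
        if (payload === '[DONE]') return chunks;
        const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
        if (delta) chunks.push(delta);
      }
    }
    return chunks;
  }
}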
5.3 Security Hardening
- Input filtering: DOMPurify guards against XSS
- Rate limiting: simple client-side throttling (the debounce helper below)
- Encryption in transit: sensitive operations go over HTTPS
// utils/security.ts
import DOMPurify from 'dompurify';
export const sanitizeInput = (text: string) => {
return DOMPurify.sanitize(text, { ALLOWED_TAGS: [] });
};
export const debounce = <T extends (...args: any[]) => any>(
func: T,
wait: number
) => {
let timeout: ReturnType<typeof setTimeout> | undefined;
return (...args: Parameters<T>) => {
clearTimeout(timeout);
timeout = setTimeout(() => func(...args), wait);
};
};
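A quick example of how these helpers could be used together on the chat input (the 300 ms wait is an arbitrary choice):
// Example usage (sketch)
import { sanitizeInput, debounce } from '@/utils/security';

const send = (raw: string) => {
  const clean = sanitizeInput(raw); // "Hi <b>there</b>" becomes "Hi there": tags stripped, text kept
  if (clean.trim()) {
    console.log('sending:', clean);
  }
};

// collapse rapid repeated Enter presses into a single call
const debouncedSend = debounce(send, 300);
debouncedSend('Hi <b>there</b>');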
6. Putting It All Together
6.1 The Main Component
<!-- ChatView.vue -->
<template>
<div class="chat-wrapper">
<div class="message-list" ref="messageList">
<MessageBubble
v-for="(msg, index) in messages"
:key="index"
:content="msg.content"
:is-user="msg.isUser"
:is-streaming="index === messages.length - 1 && isStreaming"
/>
</div>
<div class="input-area">
<textarea v-model="input" @keydown.enter.prevent="handleSubmit" />
<button @click="handleSubmit" :disabled="isLoading">
{{ isLoading ? 'Sending...' : 'Send' }}
</button>
</div>
</div>
</template>
<script setup lang="ts">
import { ref, nextTick } from 'vue';
import { storeToRefs } from 'pinia';
import { useChatStore } from '@/stores/chat';
import { createModel } from '@/api/modelStrategy';

const chatStore = useChatStore();
const { messages } = storeToRefs(chatStore); // message list lives in the Pinia store (section 6.2)
const messageList = ref<HTMLElement>();      // backs ref="messageList" in the template
const input = ref('');
const isLoading = ref(false);
const isStreaming = ref(false);
const model = createModel(import.meta.env.VITE_API_TYPE);

const scrollToBottom = async () => {
  await nextTick();
  messageList.value?.scrollTo({ top: messageList.value.scrollHeight });
};

const handleSubmit = async () => {
  if (!input.value.trim()) return;
  const prompt = input.value;
  chatStore.addMessage(prompt, true);
  input.value = '';
  isLoading.value = true;
  isStreaming.value = true;
  // placeholder assistant bubble: shows the dots animation while the reply streams in
  chatStore.addMessage('', false);
  await scrollToBottom();
  try {
    const chunks = await model.streamChat(prompt);
    messages.value[messages.value.length - 1].content = chunks.join('');
  } catch (error) {
    console.error('Chat error:', error);
    messages.value[messages.value.length - 1].content = 'Sorry, something went wrong. Please try again.';
  } finally {
    isLoading.value = false;
    isStreaming.value = false;
    await scrollToBottom();
  }
};
</script>
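With the current ChatModel interface the reply is only shown once the whole stream has been collected. To render tokens as they arrive, the interface could be extended with an optional callback; a minimal sketch of the idea (the onDelta name is an assumption, not part of the code above):
// Sketch: extending the strategy interface for token-by-token rendering
interface StreamingChatModel {
  streamChat(prompt: string, onDelta?: (text: string) => void): Promise<string[]>;
}

// handleSubmit could then append each delta to the placeholder bubble as it arrives:
// const chunks = await model.streamChat(prompt, (delta) => {
//   messages.value[messages.value.length - 1].content += delta;
// });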
6.2 State Management
The Pinia store referenced by ChatView above:
// stores/chat.ts
import { defineStore } from 'pinia';
export const useChatStore = defineStore('chat', {
state: () => ({
messages: [] as Array<{content: string, isUser: boolean}>,
isLoading: false,
}),
actions: {
addMessage(message: string, isUser: boolean) {
this.messages.push({ content: message, isUser });
},
clearMessages() {
this.messages = [];
},
},
});
7. Summary and Outlook
This implementation uses Vue 3's Composition API to build a reactive streaming chat interface and the strategy pattern to adapt flexibly to the Deepseek and OpenAI APIs. The key optimizations are:
- virtual scrolling to keep long message histories fast
- a three-level error-handling scheme for robustness
- environment-variable configuration for multi-environment deployment
Possible future extensions:
- multi-turn conversation context management
- message editing and deletion
- a plugin system for custom features
- voice input and output