Compare commits


125 Commits

Author SHA1 Message Date
Tim
f773d17748 fix: mark all unread notifications across all pages 2026-02-06 17:16:02 +08:00
Tim
15d36709c3 Merge pull request #1137 from nagisa77/feature/agents-guidelines
chore: add AGENTS guides for root and submodules
2026-02-06 15:11:42 +08:00
Tim
1978490f1d chore: add AGENTS guides for root and submodules 2026-02-06 15:09:43 +08:00
Tim
680e44e743 feat: document a vibe-coding way to restart only the backend 2026-02-04 20:23:19 +08:00
Tim
657b76bb1e feat: comment pagination 2026-02-04 20:14:46 +08:00
Tim
f695db62c6 Clear filters on logo click 2026-02-04 19:03:58 +08:00
Tim
e5b386cdc2 Merge pull request #1136 from nagisa77/bugfix/1053
fix: performance optimization for home page pull-to-refresh; measured at about 6s, still slightly slow #1053
2026-01-16 16:47:46 +08:00
Tim
179699dd66 Merge pull request #1135 from nagisa77/bugfix/1130
fix: 1. fix dev_local_backend error 2. remove OpenSearch due to excessive memory usage
2026-01-16 16:45:35 +08:00
Tim
ef39b5fedf fix: performance optimization for home page pull-to-refresh; measured at about 6s, still slightly slow #1053 2026-01-16 16:45:07 +08:00
Tim
e13ee1ca46 fix: 1. fix dev_local_backend error 2. remove OpenSearch due to excessive memory usage 2026-01-16 15:22:59 +08:00
Tim
09f1435e33 Merge pull request #1134 from nagisa77/feature/view_history
fix: add view history logic
2026-01-16 15:13:22 +08:00
Tim
7e7cebbbe7 fix: add view history logic 2026-01-16 15:05:06 +08:00
Tim
5c1031c57c Merge pull request #1133 from nagisa77/feature/fix_copy_lick_background
feat: update copylink color
2026-01-16 14:21:02 +08:00
Tim
e6730b2882 feat: update copylink color 2026-01-16 14:17:23 +08:00
Tim
21b1c3317a fix: resolve issue where email sending failed but the frontend showed it as sent 2026-01-16 11:12:20 +08:00
Tim
72a915af2e fix: resolve icon loading issue 2026-01-15 21:33:55 +08:00
Tim
f000011994 fix: resolve article loading issue 2026-01-15 21:29:44 +08:00
Tim
d48c9dc27a fix: document the initial username and password 2026-01-15 21:16:58 +08:00
Tim
94f955e50f Update daily_news_bot.ts 2025-10-31 11:48:22 +08:00
Tim
bf94707914 Update daily_news_bot.ts 2025-10-31 10:54:36 +08:00
Tim
209f0ef1f8 Update daily_news_bot.ts 2025-10-31 10:50:53 +08:00
Tim
e2d900759a Merge pull request #1126 from nagisa77/codex/set-news-bot-model-to-4o
Allow bots to choose agent model
2025-10-31 10:46:55 +08:00
Tim
40a233a66b Allow bots to override agent model 2025-10-31 10:46:22 +08:00
Tim
b8c0b1c6f8 Update bot_father.ts 2025-10-31 10:29:41 +08:00
Tim
b37df67d31 Update daily_news_bot instructions for content sourcing 2025-10-30 10:09:03 +08:00
Tim
90865b02c9 Update news-bot.yml 2025-10-30 09:40:32 +08:00
Tim
f8c0335982 fix: adjust prompt 2025-10-29 18:32:04 +08:00
Tim
20b3d89a00 fix: adjust prompt 2025-10-29 18:28:33 +08:00
Tim
ddae56d483 fix: prompt 2025-10-29 18:12:46 +08:00
Tim
265fce4153 feat: add workflow 2025-10-29 18:03:49 +08:00
Tim
cc0880e2c1 Merge pull request #1125 from nagisa77/feature/news_bot
Feature/news bot
2025-10-29 18:02:19 +08:00
Tim
5fe3eec815 Merge pull request #1124 from nagisa77/codex/create-daily-news-bot-with-summaries
Add daily news bot workflow
2025-10-29 18:01:21 +08:00
Tim
f0feb7a45c Add daily news bot workflow 2025-10-29 18:01:08 +08:00
Tim
784057207f feat: add websearch tools 2025-10-29 17:54:06 +08:00
Tim
bed72662b5 Update reply_bot.ts 2025-10-29 14:31:07 +08:00
Tim
895dba495b Update reply_bot.ts 2025-10-29 14:30:14 +08:00
Tim
32dc6bfaf9 Update reply_bot.ts 2025-10-29 14:25:59 +08:00
Tim
4766250577 Update reply_bot.ts 2025-10-29 14:21:27 +08:00
Tim
13baffa9f1 Merge pull request #1123 from nagisa77/codex/rename-coffee-and-reply-bots-to-system
Adjust coffee bot schedule and update bot personas
2025-10-29 14:20:40 +08:00
Tim
d0d7580ac3 Adjust bot identities and coffee bot schedule 2025-10-29 14:20:29 +08:00
Tim
fd4e651a49 Merge pull request #1122 from nagisa77/codex/update-readme.md-with-bot-integration-details
docs: highlight bot integration in README
2025-10-29 13:15:17 +08:00
Tim
58317687d7 docs: update README bot info 2025-10-29 13:15:06 +08:00
tim
006e46f4ef Revert "fix: revise prompt"
This reverts commit 2c27766544.
2025-10-28 21:56:49 +08:00
Tim
2c27766544 fix: revise prompt 2025-10-28 20:48:37 +08:00
Tim
c305992223 fix: revise prompt 2025-10-28 20:25:07 +08:00
Tim
babd2c6549 Merge pull request #1121 from nagisa77/codex/add-opensourcereplybot-with-detailed-responses
feat(bot): add open source reply bot
2025-10-28 19:56:03 +08:00
Tim
d98c3644a6 fix: add GitHub Action 2025-10-28 19:55:29 +08:00
Tim
dbb63a4039 feat(bot): add open source reply bot 2025-10-28 19:53:07 +08:00
Tim
49aeff3a83 Merge pull request #1120 from nagisa77/codex/add-is_bot-field-to-user-table
Add bot flag to user model and show comment badge
2025-10-28 19:49:44 +08:00
Tim
512e5623e1 Add bot flag to users and surface in comments 2025-10-28 19:49:33 +08:00
Tim
8db928b9a8 Merge pull request #1119 from nagisa77/codex/update-lottery-time-from-23-to-15
Adjust coffee bot draw schedule
2025-10-28 18:58:05 +08:00
Tim
46f6ccb3a8 Adjust coffee draw schedule 2025-10-28 18:57:53 +08:00
Tim
87dcebf052 fix: rewrite tools 2025-10-28 18:50:34 +08:00
Tim
0ad4f4feff fix: fix tools 2025-10-28 18:48:32 +08:00
Tim
a227ac77fb Merge pull request #1118 from nagisa77/codex/add-daily-weather-lookup-for-cities
fix: pass the token through
2025-10-28 18:45:30 +08:00
Tim
ef53a40ed5 fix: pass the token through 2025-10-28 18:44:43 +08:00
Tim
7d8c9b68bd Merge pull request #1117 from nagisa77/codex/add-daily-weather-lookup-for-cities
Integrate weather MCP into Coffee Bot
2025-10-28 18:43:15 +08:00
Tim
dbc3d54fa1 fix: should add weather mcp 2025-10-28 18:42:47 +08:00
Tim
4c0b9e744a feat: weather mcp 2025-10-28 18:39:58 +08:00
Tim
4b4d1a2a86 Merge branch 'main' into codex/add-daily-weather-lookup-for-cities
# Conflicts:
#	bots/instance/coffee_bot.ts
2025-10-28 18:39:44 +08:00
Tim
6990aa93ed Integrate weather MCP into Coffee Bot 2025-10-28 18:32:42 +08:00
Tim
421b8b6b4f fix: revise prompt 2025-10-28 18:08:19 +08:00
Tim
e55acc6dc4 fix: resolve timezone issue 2025-10-28 18:01:03 +08:00
Tim
33ce56aa31 fix: resolve timezone issue 2025-10-28 17:58:13 +08:00
Tim
339c39c6ca fix: timezone settings 2025-10-28 17:55:36 +08:00
Tim
389961c922 fix: correct prompt 2025-10-28 17:47:00 +08:00
Tim
6db53274fb fix: correct date 2025-10-28 17:44:14 +08:00
Tim
a413c0be35 fix: correct syntax 2025-10-28 17:35:01 +08:00
Tim
06ecd39c8b Merge branch 'main' of github.com:nagisa77/OpenIsle
# Conflicts:
#	bots/instance/coffee_bot.ts
2025-10-28 17:34:47 +08:00
Tim
f0ba00b7e8 fix: correct lottery post issue 2025-10-28 17:33:23 +08:00
Tim
092c4c36c2 Merge pull request #1116 from nagisa77/codex/create-post-for-coffee-bot-with-categories-and-tags
Update coffee bot post metadata instructions
2025-10-28 17:27:52 +08:00
Tim
db13f8145d Update coffee bot post metadata instructions 2025-10-28 17:27:37 +08:00
Tim
3be396976a Merge pull request #1115 from nagisa77/codex/fix-coffee-bot-mcp-service-error
Validate MCP post creation inputs
2025-10-28 17:23:21 +08:00
Tim
3fbaa332fc Validate required fields for MCP post creation 2025-10-28 17:22:58 +08:00
Tim
4e6cb59753 fix: correct syntax issue 2025-10-28 16:49:44 +08:00
Tim
1c6c17e577 fix: correct syntax issue 2025-10-28 16:47:00 +08:00
Tim
c968efa42a Revert "Fix search client reply argument order"
This reverts commit 7a2cf829c7.
2025-10-28 16:38:53 +08:00
Tim
0cd5ded39b Merge pull request #1114 from nagisa77/codex/fix-syntaxerror-in-search_client.py
Fix reply methods argument order in MCP search client
2025-10-28 16:33:05 +08:00
Tim
7a2cf829c7 Fix search client reply argument order 2025-10-28 16:32:53 +08:00
Tim
12329b43d1 Merge pull request #1113 from nagisa77/codex/pass-bearer-token-to-backend-apis
feat: forward Authorization header for MCP backend requests
2025-10-28 16:22:17 +08:00
Tim
1a45603e0f feat: forward authorization headers to backend 2025-10-28 16:22:04 +08:00
Tim
4a73503399 Merge pull request #1112 from nagisa77/codex/fix-authorization-parameter-error
Fix hosted MCP authorization configuration
2025-10-28 16:00:07 +08:00
Tim
83bf8c1d5e Fix hosted MCP auth header collision 2025-10-28 15:59:57 +08:00
Tim
34e206f05d Merge pull request #1111 from nagisa77/codex/refactor-getbaseinstructions-to-remove-openisletoken
Inject OpenIsle token into MCP requests
2025-10-28 15:57:04 +08:00
Tim
dc349923e9 Inject OpenIsle token into MCP requests 2025-10-28 15:56:51 +08:00
Tim
0d44c9a823 Merge pull request #1110 from nagisa77/codex/update-coffee-bot-prize-image
Update coffee bot prize image instructions
2025-10-28 15:21:23 +08:00
Tim
02645af321 Update coffee bot prize image instructions 2025-10-28 15:21:03 +08:00
Tim
c3a175f13f Merge pull request #1109 from nagisa77/codex/create-coffee-bot-for-lottery-post
Add coffee bot lottery poster and schedule
2025-10-28 15:14:51 +08:00
Tim
0821d447f7 Add coffee bot lottery poster and schedule 2025-10-28 15:14:37 +08:00
Tim
257794ca00 feat: allow bot father to create posts 2025-10-28 15:12:53 +08:00
Tim
6a527de3eb Merge pull request #1108 from nagisa77/codex/update-getadditionalinstructions-for-reply_bot
Enhance reply bot persona with site background
2025-10-28 15:07:27 +08:00
Tim
2313f90eb3 Enhance reply bot persona with site background 2025-10-28 15:07:14 +08:00
Tim
7fde984e7d Merge pull request #1107 from nagisa77/codex/add-posting-tool-support-for-mcp-service
Add MCP tool for creating posts
2025-10-28 15:06:09 +08:00
Tim
fc41e605e4 Add MCP tool for creating posts 2025-10-28 15:05:55 +08:00
Tim
042e5fdbe6 fix: make a bot father 2025-10-28 15:01:33 +08:00
Tim
629442bff6 fix: make a bot father 2025-10-28 14:58:38 +08:00
Tim
7798910be0 Merge pull request #1106 from nagisa77/codex/refactor-bots-directory-to-oop
Refactor reply bots to extend BotFather base class
2025-10-28 14:55:02 +08:00
Tim
6f036eb4fe Refactor reply bot with BotFather base class 2025-10-28 14:54:49 +08:00
Tim
56fc05cb3c fix: add environment 2025-10-28 14:15:03 +08:00
Tim
a55a15659b fix: resolve script failure 2025-10-28 14:11:29 +08:00
Tim
ccf6e0c7ce Merge pull request #1104 from nagisa77/feature/bot
Feature/bot
2025-10-28 13:57:23 +08:00
Tim
87677f5968 Merge pull request #1103 from nagisa77/codex/add-git-action-to-run-reply_bots.ts
Add scheduled workflow to run reply bots
2025-10-28 13:56:31 +08:00
Tim
fd93a2dc61 Add scheduled reply bot workflow 2025-10-28 13:56:18 +08:00
Tim
80f862a226 Merge pull request #1102 from nagisa77/codex/add-read-cleanup-interface-for-mcp
Add MCP support for clearing read notifications
2025-10-28 13:50:33 +08:00
Tim
26bb85f4d4 Add MCP support for clearing read notifications 2025-10-28 13:50:16 +08:00
tim
398b4b482f fix: refine prompt 2025-10-28 13:00:42 +08:00
tim
2cfb302981 fix: add bot 2025-10-28 12:37:17 +08:00
Tim
e75bd76b71 Merge pull request #1101 from nagisa77/codex/fix-unread-notifications-data-format
Normalize null list payloads in notification schemas
2025-10-28 10:27:54 +08:00
Tim
99c3ac1837 Handle null list fields in notification schemas 2025-10-28 10:27:40 +08:00
tim
749ab560ff Revert "Cache MCP session JWT tokens"
This reverts commit 997dacdbe6.
2025-10-28 01:55:46 +08:00
tim
541ad4d149 Revert "Remove token parameters from MCP tools"
This reverts commit e585100625.
2025-10-28 01:55:41 +08:00
tim
03eb027ea4 Revert "Add MCP tool for setting session token"
This reverts commit 9dadaad5ba.
2025-10-28 01:55:36 +08:00
Tim
4194b2be91 Merge pull request #1100 from nagisa77/codex/add-initialization-tool-for-jwt-token
Add MCP tool for initializing session JWT tokens
2025-10-28 01:47:28 +08:00
Tim
9dadaad5ba Add MCP tool for setting session token 2025-10-28 01:47:16 +08:00
Tim
d4b3400c5f Merge pull request #1099 from nagisa77/codex/remove-token-parameters-from-mcp-api
Remove explicit token parameters from MCP tools
2025-10-28 01:32:18 +08:00
Tim
e585100625 Remove token parameters from MCP tools 2025-10-28 01:32:02 +08:00
Tim
e94471b53e Merge pull request #1098 from nagisa77/codex/store-accesstoken-as-jwt-token
Cache MCP session JWT tokens
2025-10-28 01:20:52 +08:00
Tim
997dacdbe6 Cache MCP session JWT tokens 2025-10-28 01:20:32 +08:00
Tim
c01349a436 Merge pull request #1097 from nagisa77/codex/improve-python-mcp-logs
Improve MCP logging and add unread notification tool
2025-10-28 01:01:47 +08:00
Tim
4cf48f9157 Enhance MCP logging and add unread message tool 2025-10-28 01:01:25 +08:00
Tim
796afbe612 Merge pull request #1096 from nagisa77/codex/add-reply-support-for-posts
Add MCP support for replying to posts
2025-10-27 20:24:59 +08:00
Tim
dca14390ca Add MCP tool for replying to posts 2025-10-27 20:24:47 +08:00
Tim
39875acd35 Merge pull request #1095 from nagisa77/codex/add-post-query-interface-in-mcp-iq4fxi
Add MCP tool for retrieving post details
2025-10-27 20:19:29 +08:00
Tim
62edc75735 feat(mcp): add post detail retrieval tool 2025-10-27 20:19:17 +08:00
Tim
26ca9fc916 Merge pull request #1093 from nagisa77/codex/add-reply-and-recent-post-query-apis 2025-10-27 16:13:22 +08:00
63 changed files with 3066 additions and 209 deletions

.github/workflows/coffee-bot.yml (new file, +30)

@@ -0,0 +1,30 @@
name: Coffee Bot
on:
schedule:
- cron: "0 23 * * 0-4"
workflow_dispatch:
jobs:
run-coffee-bot:
environment: Bots
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: "20"
cache: "npm"
- name: Install dependencies
run: npm install --no-save @openai/agents tsx typescript
- name: Run coffee bot
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
OPENISLE_TOKEN: ${{ secrets.OPENISLE_TOKEN }}
APIFY_API_TOKEN: ${{ secrets.APIFY_API_TOKEN }}
run: npx tsx bots/instance/coffee_bot.ts
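Note that GitHub Actions evaluates `schedule` cron expressions in UTC, so `0 23 * * 0-4` fires at 23:00 UTC on Sunday through Thursday, which is 07:00 Monday through Friday in the +08:00 timezone used throughout this log. A quick sanity check (the helper name is my own, not part of the repo):

```python
from datetime import datetime, timedelta, timezone

def utc_cron_hour_in_beijing(utc_hour: int) -> int:
    # GitHub Actions cron is evaluated in UTC; convert the fire hour to UTC+8.
    fire = datetime(2025, 1, 6, utc_hour, tzinfo=timezone.utc)
    return fire.astimezone(timezone(timedelta(hours=8))).hour

print(utc_cron_hour_in_beijing(23))  # → 7
```

This UTC offset is likely what the later "fix: resolve timezone issue" commits were wrestling with.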

.github/workflows/news-bot.yml (new file, +30)

@@ -0,0 +1,30 @@
name: Daily News Bot
on:
schedule:
- cron: "0 22 * * 0-4"
workflow_dispatch:
jobs:
run-daily-news-bot:
environment: Bots
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: "20"
cache: "npm"
- name: Install dependencies
run: npm install --no-save @openai/agents tsx typescript
- name: Run daily news bot
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
OPENISLE_TOKEN: ${{ secrets.OPENISLE_TOKEN }}
APIFY_API_TOKEN: ${{ secrets.APIFY_API_TOKEN }}
run: npx tsx bots/instance/daily_news_bot.ts


@@ -0,0 +1,30 @@
name: Open Source Reply Bot
on:
schedule:
- cron: "*/30 * * * *"
workflow_dispatch:
jobs:
run-open-source-reply-bot:
environment: Bots
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: "20"
cache: "npm"
- name: Install dependencies
run: npm install --no-save @openai/agents tsx typescript
- name: Run open source reply bot
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
OPENISLE_TOKEN: ${{ secrets.OPENISLE_TOKEN_BOT_1 }}
APIFY_API_TOKEN: ${{ secrets.APIFY_API_TOKEN }}
run: npx tsx bots/instance/open_source_reply_bot.ts

.github/workflows/reply-bots.yml (new file, +30)

@@ -0,0 +1,30 @@
name: Reply Bots
on:
schedule:
- cron: "*/30 * * * *"
workflow_dispatch:
jobs:
run-reply-bot:
environment: Bots
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: "20"
cache: "npm"
- name: Install dependencies
run: npm install --no-save @openai/agents tsx typescript
- name: Run reply bot
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
OPENISLE_TOKEN: ${{ secrets.OPENISLE_TOKEN }}
APIFY_API_TOKEN: ${{ secrets.APIFY_API_TOKEN }}
run: npx tsx bots/instance/reply_bot.ts

AGENTS.md (new file, +68)

@@ -0,0 +1,68 @@
# OpenIsle Collaboration Guide (repository root)
## 1) Scope and precedence
- This file applies to the entire repository.
- If a subdirectory has its own `AGENTS.md`, the "nearest directory" rule wins (closer files override).
- When no subdirectory rule exists, fall back to this file.
## 2) Repository map (high-traffic modules)
- `backend/`: Spring Boot main backend (JPA, Redis, RabbitMQ, OpenSearch, OpenAPI).
- `frontend_nuxt/`: Nuxt 3 frontend (SSR + WebSocket + OAuth callback pages).
- `websocket_service/`: standalone WebSocket/STOMP service (consumes RabbitMQ notifications and pushes them).
- `mcp/`: Python MCP server (wraps search, posting, replying, notifications, and other tools).
- `docs/`: Fumadocs documentation site (including OpenAPI generation).
- `deploy/`: production/staging deployment scripts.
- `bots/`: scheduled bot scripts (driven by GitHub Actions).
## 3) Global working principles
- Small steps: touch only files relevant to the current task; no unrelated refactoring.
- Align on contracts before changing implementations: keep APIs, message structures, and environment variable names stable first.
- Changes must be verifiable: provide a minimal executable verification command and its result before submitting.
- Security first: never commit secrets, tokens, or production credentials.
## 4) Task routing (gauge the blast radius first)
- Backend business logic only: go to `backend/AGENTS.md`.
- Frontend page interaction only: go to `frontend_nuxt/AGENTS.md`.
- Real-time messages/online notifications: review `backend/` + `websocket_service/` + `frontend_nuxt/` together.
- API/DTO/auth changes: review at least `backend/` + `mcp/` + `docs/`.
- Deployment pipeline changes: go to `deploy/AGENTS.md` and attach risks plus a rollback plan.
## 5) Cross-service invariants (must hold)
- Environment variables use the root `.env.example` as the single baseline; renaming a variable requires updating every consumer.
- Keep the auth chain consistent:
  - Backend JWT: `backend/src/main/java/com/openisle/config/SecurityConfig.java`
  - WebSocket JWT: `websocket_service/src/main/java/com/openisle/websocket/config/WebSocketAuthInterceptor.java`
- Keep RabbitMQ sharding consistent (16 hexadecimal shards + legacy-queue compatibility):
  - `backend/src/main/java/com/openisle/config/RabbitMQConfig.java`
  - `backend/src/main/java/com/openisle/config/ShardingStrategy.java`
  - `websocket_service/src/main/java/com/openisle/websocket/listener/NotificationListener.java`
- Synchronize API contract changes:
  - MCP schemas/tools: `mcp/src/openisle_mcp/schemas.py`, `mcp/src/openisle_mcp/server.py`
  - Documentation generation: `docs/scripts/generate-docs.ts`, `docs/lib/openapi.ts`
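The 16-shard RabbitMQ invariant can be pictured with a minimal routing sketch. The hash function and queue-name format below are assumptions for illustration only, not the actual `ShardingStrategy` implementation:

```python
import hashlib

def shard_queue_for(user_id: str) -> str:
    # Pick one of 16 shards from the first hex digit of a stable hash.
    # MD5 and the queue-name prefix are illustrative assumptions; the real
    # backend ShardingStrategy may hash and name queues differently.
    digit = hashlib.md5(user_id.encode("utf-8")).hexdigest()[0]
    return f"notifications.shard.{digit}"
```

The point of the invariant is that producer and consumer must derive the same shard for the same key, so any change here has to land in all three files listed above at once.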
## 6) Minimal verification matrix (by change type)
- Backend changes: `cd backend && mvn test`
- Frontend changes: `cd frontend_nuxt && npm run build`
- WebSocket service changes: `cd websocket_service && mvn test` (fall back to `mvn -DskipTests compile` when there are no tests)
- MCP changes: `cd mcp && python -m pip install -e .`
- Docs/OpenAPI changes: `cd docs && bun run generate && bun run build`
- Deployment script changes: `bash -n deploy/deploy.sh && bash -n deploy/deploy_staging.sh`
## 7) Delivery checklist (suggested output for every task)
- What changed: list the core changes file by file.
- Why it changed: state the trigger, defect, or consistency requirement.
- How it was verified: give the commands actually run and a summary of the results.
- Risks and follow-ups: list remaining risks, optional regression checks, and rollback advice.
## 8) Prohibited
- Do not commit `.env`, secrets, or production tokens.
- Do not run destructive commands (large-scale deletion, force resets) without explicit authorization.
- Do not reformat or reorder unrelated files as "drive-by cleanup".


@@ -57,6 +57,17 @@ cd OpenIsle
--profile dev up -d --force-recreate
```
Restart only the backend containers (no image rebuild; the frontend is unaffected):
```shell
docker compose \
-f docker/docker-compose.yaml \
--env-file .env \
--profile dev restart springboot websocket-service
```
The data-initialization SQL creates a few accounts for testing:
> username: admin/user1/user2 password: 123456
3. Check service status:
```shell
docker compose -f docker/docker-compose.yaml --env-file .env ps


@@ -26,7 +26,7 @@ OpenIsle is a full-stack open-source community platform built with Spring Boot and Vue 3
- Integrates OpenAI-powered Markdown formatting
- Password strength, login methods, protection codes, and more can be tuned via environment variables
- Supports image upload, backed by Tencent Cloud COS by default
- Default avatars use DiceBear Avatars; theme and size are customizable via the `AVATAR_STYLE` and `AVATAR_SIZE` environment variables
- Bot integration: connect custom bots inside the platform, and create and manage messaging bots through Telegram's BotFather to extend community interaction channels
- Browser push notifications, so reminders arrive even after leaving the site
## 🌟 Project highlights
@@ -41,7 +41,7 @@ OpenIsle is a full-stack open-source community platform built with Spring Boot and Vue 3
## 🏘️ Community
Everyone is welcome to discuss and use OpenIsle. The project is open source; to learn more, visit: <https://github.com/nagisa77/OpenIsle>
- Everyone is welcome to discuss and use OpenIsle. The project is open source; if you run into problems, report them on the GitHub Issues page, and to start a discussion you can also visit the origin site <https://www.open-isle.com>, which offers fuller community boards and interaction
## 📋 License

backend/AGENTS.md (new file, +59)

@@ -0,0 +1,59 @@
# Backend Collaboration Guide
## 1) Scope
- Applies to the `backend/` directory and its subdirectories.
- On conflict with the root `AGENTS.md`, this file wins (backend scope only).
## 2) Mental model of the code layout
- `controller/`: API layer (input validation, permission boundaries, response format).
- `service/`: business orchestration and domain rules (core logic lives here).
- `repository/`: JPA data access (entities and query methods).
- `model/`: entity models and enums.
- `dto/` + `mapper/`: external contracts and mapping/conversion.
- `config/`: infrastructure configuration (security, caching, MQ, OpenAPI, initializers, etc.).
- `search/`: OpenSearch indexing and event-driven sync.
## 3) Backend change rules
- Keep controllers "thin"; push complex logic down into `service/`.
- Prefer backward-compatible DTO changes; avoid unversioned breaking field deletions or renames.
- When adding an endpoint:
  - Add the necessary auth rules (`SecurityConfig`).
  - Add OpenAPI annotations (`@Operation`, `@ApiResponse`, etc.).
- When caching is involved, confirm cache names, TTLs, and eviction strategies are consistent in `CachingConfig`.
## 4) Key consistency checks
- Auth and public endpoints:
  - `src/main/java/com/openisle/config/SecurityConfig.java`
- Search index sync (when entity fields or copy change):
  - `src/main/java/com/openisle/search/SearchDocumentFactory.java`
  - `src/main/java/com/openisle/search/SearchIndexEventPublisher.java`
- Notification pipeline (comments/notifications):
  - `src/main/java/com/openisle/config/RabbitMQConfig.java`
  - `src/main/java/com/openisle/config/ShardingStrategy.java`
  - `src/main/java/com/openisle/service/NotificationProducer.java`
- Environment variable consumers:
  - `src/main/resources/application.properties`
  - root `.env.example`
## 5) Data and transaction notes
- For multi-table writes, define transaction boundaries explicitly to avoid half-committed states.
- Do not operate on repositories directly from controllers.
- Map JPA lazy-loaded objects through DTOs before returning them, to avoid serialization side effects.
## 6) Testing and verification
- Prefer the full suite: `mvn test`
- For focused changes, run targeted tests first (examples):
  - `mvn -Dtest=PostControllerTest test`
  - `mvn -Dtest=SearchServiceTest test`
- For search/MQ config changes, complete at least one startup-level check or key integration test.
## 7) Output requirements
- State explicitly whether endpoints, fields, permissions, or events changed.
- If `mcp/` or `docs/` are affected, flag the needed follow-up changes in the result.


@@ -6,10 +6,12 @@ import com.openisle.model.User;
import com.openisle.repository.NotificationRepository;
import com.openisle.repository.UserRepository;
import com.openisle.service.EmailSender;
import com.openisle.exception.EmailSendException;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.security.SecurityRequirement;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
@@ -17,6 +19,7 @@ import org.springframework.web.bind.annotation.*;
@RestController
@RequestMapping("/api/admin/users")
@RequiredArgsConstructor
@Slf4j
public class AdminUserController {
private final UserRepository userRepository;
@@ -35,11 +38,15 @@ public class AdminUserController {
user.setApproved(true);
userRepository.save(user);
markRegisterRequestNotificationsRead(user);
emailSender.sendEmail(
user.getEmail(),
"您的注册已审核通过",
"🎉您的注册已审核通过, 点击以访问网站: " + websiteUrl
);
try {
emailSender.sendEmail(
user.getEmail(),
"您的注册已审核通过",
"🎉您的注册已经审核通过, 点击以访问网站: " + websiteUrl
);
} catch (EmailSendException e) {
log.warn("Failed to send approve email to {}: {}", user.getEmail(), e.getMessage());
}
return ResponseEntity.ok().build();
}
@@ -52,11 +59,15 @@ public class AdminUserController {
user.setApproved(false);
userRepository.save(user);
markRegisterRequestNotificationsRead(user);
emailSender.sendEmail(
user.getEmail(),
"您的注册已被管理员拒绝",
"您的注册被管理员拒绝, 点击链接可以重新填写理由申请: " + websiteUrl
);
try {
emailSender.sendEmail(
user.getEmail(),
"您的注册被管理员拒绝",
"您的注册被管理员拒绝, 点击链接可以重新填写理由申请: " + websiteUrl
);
} catch (EmailSendException e) {
log.warn("Failed to send reject email to {}: {}", user.getEmail(), e.getMessage());
}
return ResponseEntity.ok().build();
}


@@ -2,6 +2,7 @@ package com.openisle.controller;
import com.openisle.config.CachingConfig;
import com.openisle.dto.*;
import com.openisle.exception.EmailSendException;
import com.openisle.exception.FieldException;
import com.openisle.model.RegisterMode;
import com.openisle.model.User;
@@ -19,6 +20,7 @@ import java.util.concurrent.TimeUnit;
import lombok.RequiredArgsConstructor;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
@@ -83,6 +85,17 @@ public class AuthController {
"INVITE_APPROVED"
)
);
} catch (EmailSendException e) {
return ResponseEntity
.status(HttpStatus.INTERNAL_SERVER_ERROR)
.body(
Map.of(
"error",
"邮件发送失败: " + e.getMessage(),
"reason_code",
"EMAIL_SEND_FAILED"
)
);
} catch (FieldException e) {
return ResponseEntity.badRequest().body(
Map.of("field", e.getField(), "error", e.getMessage())
@@ -97,7 +110,20 @@ public class AuthController {
registerModeService.getRegisterMode()
);
// 发送确认邮件
userService.sendVerifyMail(user, VerifyType.REGISTER);
try {
userService.sendVerifyMail(user, VerifyType.REGISTER);
} catch (EmailSendException e) {
return ResponseEntity
.status(HttpStatus.INTERNAL_SERVER_ERROR)
.body(
Map.of(
"error",
"邮件发送失败: " + e.getMessage(),
"reason_code",
"EMAIL_SEND_FAILED"
)
);
}
if (!user.isApproved()) {
notificationService.createRegisterRequestNotifications(user, user.getRegisterReason());
}
@@ -169,14 +195,28 @@ public class AuthController {
}
User user = userOpt.get();
if (!user.isVerified()) {
user = userService.register(
user.getUsername(),
user.getEmail(),
user.getPassword(),
user.getRegisterReason(),
registerModeService.getRegisterMode()
);
userService.sendVerifyMail(user, VerifyType.REGISTER);
user =
userService.register(
user.getUsername(),
user.getEmail(),
user.getPassword(),
user.getRegisterReason(),
registerModeService.getRegisterMode()
);
try {
userService.sendVerifyMail(user, VerifyType.REGISTER);
} catch (EmailSendException e) {
return ResponseEntity
.status(HttpStatus.INTERNAL_SERVER_ERROR)
.body(
Map.of(
"error",
"Failed to send verification email: " + e.getMessage(),
"reason_code",
"EMAIL_SEND_FAILED"
)
);
}
return ResponseEntity.badRequest().body(
Map.of(
"error",
@@ -663,7 +703,20 @@ public class AuthController {
if (userOpt.isEmpty()) {
return ResponseEntity.badRequest().body(Map.of("error", "User not found"));
}
userService.sendVerifyMail(userOpt.get(), VerifyType.RESET_PASSWORD);
try {
userService.sendVerifyMail(userOpt.get(), VerifyType.RESET_PASSWORD);
} catch (EmailSendException e) {
return ResponseEntity
.status(HttpStatus.INTERNAL_SERVER_ERROR)
.body(
Map.of(
"error",
"邮件发送失败: " + e.getMessage(),
"reason_code",
"EMAIL_SEND_FAILED"
)
);
}
return ResponseEntity.ok(Map.of("message", "Verification code sent"));
}
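The pattern added throughout this controller, catching `EmailSendException` and returning a 500 with a machine-readable `reason_code`, can be sketched as follows. This is a Python stand-in for illustration; the exception class and helper names here are mine, not the repo's:

```python
class EmailSendError(Exception):
    """Stand-in for the Java EmailSendException."""

def send_verify_mail_response(send_mail, user):
    # Mirrors the contract in the diff: an email failure becomes a 500 with
    # a machine-readable reason_code instead of an unhandled exception.
    try:
        send_mail(user)
    except EmailSendError as e:
        return 500, {"error": f"Failed to send email: {e}",
                     "reason_code": "EMAIL_SEND_FAILED"}
    return 200, {"message": "Verification code sent"}
```

Clients can branch on `reason_code` rather than parsing the human-readable message, which is what the earlier "email sending failed but the frontend showed it as sent" fix needed.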


@@ -112,7 +112,9 @@ public class CommentController {
)
public List<TimelineItemDto<?>> listComments(
@PathVariable Long postId,
@RequestParam(value = "sort", required = false, defaultValue = "OLDEST") CommentSort sort
@RequestParam(value = "sort", required = false, defaultValue = "OLDEST") CommentSort sort,
@RequestParam(value = "page", required = false, defaultValue = "0") int page,
@RequestParam(value = "pageSize", required = false, defaultValue = "20") int pageSize
) {
log.debug("listComments called for post {} with sort {}", postId, sort);
List<CommentDto> commentDtoList = commentService
@@ -183,8 +185,23 @@ public class CommentController {
createdAtComparator = createdAtComparator.reversed();
}
itemDtoList.sort(comparator.thenComparing(createdAtComparator));
log.debug("listComments returning {} comments", itemDtoList.size());
return itemDtoList;
int safePage = Math.max(0, page);
int safePageSize = Math.max(1, pageSize);
int fromIndex = safePage * safePageSize;
int toIndex = Math.min(fromIndex + safePageSize, itemDtoList.size());
List<TimelineItemDto<?>> pagedItems =
fromIndex >= itemDtoList.size() ? List.of() : itemDtoList.subList(fromIndex, toIndex);
log.debug(
"listComments returning {} items for post {} page {} size {} (total {})",
pagedItems.size(),
postId,
safePage,
safePageSize,
itemDtoList.size()
);
return pagedItems;
}
@GetMapping("/comments/{commentId}/context")
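The pagination clamps added above behave as follows: negative pages clamp to 0, page sizes below 1 clamp to 1, and a page past the end yields an empty list rather than an error. The same slicing logic in a short sketch:

```python
def paginate(items, page=0, page_size=20):
    # Mirrors the controller: clamp inputs, then slice.
    safe_page = max(0, page)
    safe_size = max(1, page_size)
    start = safe_page * safe_size
    end = min(start + safe_size, len(items))
    return [] if start >= len(items) else items[start:end]

print(paginate(list(range(45)), page=2, page_size=20))  # → [40, 41, 42, 43, 44]
```

Note this pages the in-memory timeline after sorting, so it fixes response size but not the cost of loading all comments; a repository-level `Pageable` query would be the next step if that matters.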


@@ -217,11 +217,7 @@ public class PostController {
// userVisitService.recordVisit(auth.getName());
// }
return postService
.defaultListPosts(ids, tids, page, pageSize)
.stream()
.map(postMapper::toSummaryDto)
.collect(Collectors.toList());
return postMapper.toListDtos(postService.defaultListPosts(ids, tids, page, pageSize));
}
@GetMapping("/recent")
@@ -269,11 +265,7 @@ public class PostController {
// userVisitService.recordVisit(auth.getName());
// }
return postService
.listPostsByViews(ids, tids, page, pageSize)
.stream()
.map(postMapper::toSummaryDto)
.collect(Collectors.toList());
return postMapper.toListDtos(postService.listPostsByViews(ids, tids, page, pageSize));
}
@GetMapping("/latest-reply")
@@ -305,8 +297,7 @@ public class PostController {
// userVisitService.recordVisit(auth.getName());
// }
List<Post> posts = postService.listPostsByLatestReply(ids, tids, page, pageSize);
return posts.stream().map(postMapper::toSummaryDto).collect(Collectors.toList());
return postMapper.toListDtos(postService.listPostsByLatestReply(ids, tids, page, pageSize));
}
@GetMapping("/featured")
@@ -333,10 +324,6 @@ public class PostController {
// if (auth != null) {
// userVisitService.recordVisit(auth.getName());
// }
return postService
.listFeaturedPosts(ids, tids, page, pageSize)
.stream()
.map(postMapper::toSummaryDto)
.collect(Collectors.toList());
return postMapper.toListDtos(postService.listFeaturedPosts(ids, tids, page, pageSize));
}
}


@@ -34,6 +34,7 @@ public class UserController {
private final TagService tagService;
private final SubscriptionService subscriptionService;
private final LevelService levelService;
private final PostReadService postReadService;
private final JwtService jwtService;
private final UserMapper userMapper;
private final TagMapper tagMapper;
@@ -53,6 +54,9 @@ public class UserController {
@Value("${app.user.tags-limit:50}")
private int defaultTagsLimit;
@Value("${app.user.read-posts-limit:50}")
private int defaultReadPostsLimit;
@GetMapping("/me")
@SecurityRequirement(name = "JWT")
@Operation(summary = "Current user", description = "Get current authenticated user information")
@@ -211,6 +215,33 @@ public class UserController {
.collect(java.util.stream.Collectors.toList());
}
@GetMapping("/{identifier}/read-posts")
@SecurityRequirement(name = "JWT")
@Operation(summary = "User read posts", description = "Get post read history (self only)")
@ApiResponse(
responseCode = "200",
description = "Post read history",
content = @Content(array = @ArraySchema(schema = @Schema(implementation = PostReadDto.class)))
)
public ResponseEntity<java.util.List<PostReadDto>> userReadPosts(
@PathVariable("identifier") String identifier,
@RequestParam(value = "limit", required = false) Integer limit,
Authentication auth
) {
User user = userService.findByIdentifier(identifier).orElseThrow();
if (auth == null || !auth.getName().equals(user.getUsername())) {
return ResponseEntity.status(403).body(java.util.List.of());
}
int l = limit != null ? limit : defaultReadPostsLimit;
return ResponseEntity.ok(
postReadService
.getRecentReadsByUser(user.getUsername(), l)
.stream()
.map(userMapper::toPostReadDto)
.collect(java.util.stream.Collectors.toList())
);
}
@GetMapping("/{identifier}/hot-posts")
@Operation(summary = "User hot posts", description = "Get most reacted posts by user")
@ApiResponse(


@@ -13,4 +13,5 @@ public class AuthorDto {
private String username;
private String avatar;
private MedalType displayMedal;
private boolean bot;
}


@@ -0,0 +1,12 @@
package com.openisle.dto;
import java.time.LocalDateTime;
import lombok.Data;
/** DTO for a user's post read record. */
@Data
public class PostReadDto {
private PostMetaDto post;
private LocalDateTime lastReadAt;
}


@@ -28,4 +28,5 @@ public class UserDto {
private int point;
private int currentLevel;
private int nextLevelExp;
private boolean bot;
}


@@ -8,4 +8,5 @@ public class UserSummaryDto {
private Long id;
private String username;
private String avatar;
private boolean bot;
}


@@ -0,0 +1,15 @@
package com.openisle.exception;
/**
* Thrown when email sending fails so callers can surface a clear error upstream.
*/
public class EmailSendException extends RuntimeException {
public EmailSendException(String message) {
super(message);
}
public EmailSendException(String message, Throwable cause) {
super(message, cause);
}
}


@@ -48,6 +48,38 @@ public class PostMapper {
return dto;
}
public List<PostSummaryDto> toListDtos(List<Post> posts) {
if (posts == null || posts.isEmpty()) {
return List.of();
}
Map<Long, List<User>> participantsMap = commentService.getParticipantsForPosts(posts, 5);
return posts
.stream()
.map(post -> {
PostSummaryDto dto = new PostSummaryDto();
applyListFields(post, dto);
List<User> participants = participantsMap.get(post.getId());
if (participants != null) {
dto.setParticipants(
participants.stream().map(userMapper::toAuthorDto).collect(Collectors.toList())
);
} else {
dto.setParticipants(List.of());
}
dto.setReactions(List.of());
return dto;
})
.collect(Collectors.toList());
}
public PostSummaryDto toListDto(Post post) {
PostSummaryDto dto = new PostSummaryDto();
applyListFields(post, dto);
dto.setParticipants(List.of());
dto.setReactions(List.of());
return dto;
}
public PostDetailDto toDetailDto(Post post, String viewer) {
PostDetailDto dto = new PostDetailDto();
applyCommon(post, dto);
@@ -61,6 +93,25 @@ public class PostMapper {
return dto;
}
private void applyListFields(Post post, PostSummaryDto dto) {
dto.setId(post.getId());
dto.setTitle(post.getTitle());
dto.setContent(post.getContent());
dto.setCreatedAt(post.getCreatedAt());
dto.setAuthor(userMapper.toAuthorDto(post.getAuthor()));
dto.setCategory(categoryMapper.toDto(post.getCategory()));
dto.setTags(post.getTags().stream().map(tagMapper::toDto).collect(Collectors.toList()));
dto.setViews(post.getViews());
dto.setCommentCount(post.getCommentCount());
dto.setStatus(post.getStatus());
dto.setPinnedAt(post.getPinnedAt());
dto.setLastReplyAt(post.getLastReplyAt());
dto.setRssExcluded(post.getRssExcluded() == null || post.getRssExcluded());
dto.setClosed(post.isClosed());
dto.setVisibleScope(post.getVisibleScope());
dto.setType(post.getType());
}
private void applyCommon(Post post, PostSummaryDto dto) {
dto.setId(post.getId());
dto.setTitle(post.getTitle());

View File

@@ -3,6 +3,7 @@ package com.openisle.mapper;
import com.openisle.dto.*;
import com.openisle.model.Comment;
import com.openisle.model.Post;
import com.openisle.model.PostRead;
import com.openisle.model.User;
import com.openisle.service.*;
import java.util.stream.Collectors;
@@ -37,6 +38,7 @@ public class UserMapper {
dto.setUsername(user.getUsername());
dto.setAvatar(user.getAvatar());
dto.setDisplayMedal(user.getDisplayMedal());
dto.setBot(user.isBot());
return dto;
}
@@ -63,6 +65,7 @@ public class UserMapper {
dto.setPoint(user.getPoint());
dto.setCurrentLevel(levelService.getLevel(user.getExperience()));
dto.setNextLevelExp(levelService.nextLevelExp(user.getExperience()));
dto.setBot(user.isBot());
if (viewer != null) {
dto.setSubscribed(subscriptionService.isSubscribed(viewer.getName(), user.getUsername()));
} else {
@@ -113,4 +116,11 @@ public class UserMapper {
}
return dto;
}
public PostReadDto toPostReadDto(PostRead read) {
PostReadDto dto = new PostReadDto();
dto.setPost(toMetaDto(read.getPost()));
dto.setLastReadAt(read.getLastReadAt());
return dto;
}
}

View File

@@ -62,6 +62,9 @@ public class User {
@Column(nullable = false)
private Role role = Role.USER;
@Column(name = "is_bot", nullable = false)
private boolean bot = false;
@Enumerated(EnumType.STRING)
private MedalType displayMedal;

View File

@@ -25,6 +25,13 @@ public interface CommentRepository extends JpaRepository<Comment, Long> {
@org.springframework.data.repository.query.Param("post") Post post
);
@org.springframework.data.jpa.repository.Query(
"SELECT DISTINCT c.post.id, c.author FROM Comment c WHERE c.post.id IN :postIds"
)
java.util.List<Object[]> findDistinctAuthorsByPostIds(
@org.springframework.data.repository.query.Param("postIds") java.util.List<Long> postIds
);
@org.springframework.data.jpa.repository.Query(
"SELECT MAX(c.createdAt) FROM Comment c WHERE c.post = :post"
)

View File

@@ -3,11 +3,14 @@ package com.openisle.repository;
import com.openisle.model.Post;
import com.openisle.model.PostRead;
import com.openisle.model.User;
import java.util.List;
import java.util.Optional;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.JpaRepository;
public interface PostReadRepository extends JpaRepository<PostRead, Long> {
Optional<PostRead> findByUserAndPost(User user, Post post);
List<PostRead> findByUserOrderByLastReadAtDesc(User user, Pageable pageable);
long countByUser(User user);
void deleteByPost(Post post);
}

View File

@@ -19,6 +19,8 @@ public interface PostRepository extends JpaRepository<Post, Long> {
List<Post> findByStatusOrderByCreatedAtDesc(PostStatus status, Pageable pageable);
List<Post> findByStatusOrderByViewsDesc(PostStatus status);
List<Post> findByStatusOrderByViewsDesc(PostStatus status, Pageable pageable);
List<Post> findByStatusOrderByPinnedAtDescViewsDesc(PostStatus status, Pageable pageable);
List<Post> findByStatusOrderByPinnedAtDescLastReplyAtDesc(PostStatus status, Pageable pageable);
List<Post> findByStatusAndCreatedAtGreaterThanEqualOrderByCreatedAtDesc(
PostStatus status,
LocalDateTime createdAt
@@ -43,6 +45,16 @@ public interface PostRepository extends JpaRepository<Post, Long> {
PostStatus status,
Pageable pageable
);
List<Post> findByCategoryInAndStatusOrderByPinnedAtDescViewsDesc(
List<Category> categories,
PostStatus status,
Pageable pageable
);
List<Post> findByCategoryInAndStatusOrderByPinnedAtDescLastReplyAtDesc(
List<Category> categories,
PostStatus status,
Pageable pageable
);
List<Post> findDistinctByTagsInAndStatus(List<Tag> tags, PostStatus status);
List<Post> findDistinctByTagsInAndStatus(List<Tag> tags, PostStatus status, Pageable pageable);
List<Post> findDistinctByTagsInAndStatusOrderByCreatedAtDesc(List<Tag> tags, PostStatus status);
@@ -132,6 +144,26 @@ public interface PostRepository extends JpaRepository<Post, Long> {
Pageable pageable
);
@Query(
"SELECT p FROM Post p JOIN p.tags t WHERE t IN :tags AND p.status = :status GROUP BY p.id HAVING COUNT(DISTINCT t.id) = :tagCount ORDER BY p.pinnedAt DESC, p.views DESC"
)
List<Post> findByAllTagsOrderByPinnedAtDescViewsDesc(
@Param("tags") List<Tag> tags,
@Param("status") PostStatus status,
@Param("tagCount") long tagCount,
Pageable pageable
);
@Query(
"SELECT p FROM Post p JOIN p.tags t WHERE t IN :tags AND p.status = :status GROUP BY p.id HAVING COUNT(DISTINCT t.id) = :tagCount ORDER BY p.pinnedAt DESC, p.lastReplyAt DESC"
)
List<Post> findByAllTagsOrderByPinnedAtDescLastReplyAtDesc(
@Param("tags") List<Tag> tags,
@Param("status") PostStatus status,
@Param("tagCount") long tagCount,
Pageable pageable
);
@Query(
"SELECT p FROM Post p JOIN p.tags t WHERE p.category IN :categories AND t IN :tags AND p.status = :status GROUP BY p.id HAVING COUNT(DISTINCT t.id) = :tagCount"
)
@@ -174,6 +206,28 @@ public interface PostRepository extends JpaRepository<Post, Long> {
Pageable pageable
);
@Query(
"SELECT p FROM Post p JOIN p.tags t WHERE p.category IN :categories AND t IN :tags AND p.status = :status GROUP BY p.id HAVING COUNT(DISTINCT t.id) = :tagCount ORDER BY p.pinnedAt DESC, p.views DESC"
)
List<Post> findByCategoriesAndAllTagsOrderByPinnedAtDescViewsDesc(
@Param("categories") List<Category> categories,
@Param("tags") List<Tag> tags,
@Param("status") PostStatus status,
@Param("tagCount") long tagCount,
Pageable pageable
);
@Query(
"SELECT p FROM Post p JOIN p.tags t WHERE p.category IN :categories AND t IN :tags AND p.status = :status GROUP BY p.id HAVING COUNT(DISTINCT t.id) = :tagCount ORDER BY p.pinnedAt DESC, p.lastReplyAt DESC"
)
List<Post> findByCategoriesAndAllTagsOrderByPinnedAtDescLastReplyAtDesc(
@Param("categories") List<Category> categories,
@Param("tags") List<Tag> tags,
@Param("status") PostStatus status,
@Param("tagCount") long tagCount,
Pageable pageable
);
@Query(
"SELECT p FROM Post p JOIN p.tags t WHERE p.category IN :categories AND t IN :tags AND p.status = :status GROUP BY p.id HAVING COUNT(DISTINCT t.id) = :tagCount ORDER BY p.createdAt DESC"
)

View File

@@ -105,6 +105,7 @@ public class ChannelService {
userDto.setId(message.getSender().getId());
userDto.setUsername(message.getSender().getUsername());
userDto.setAvatar(message.getSender().getAvatar());
userDto.setBot(message.getSender().isBot());
dto.setSender(userDto);
return dto;

View File

@@ -21,8 +21,12 @@ import com.openisle.service.NotificationService;
import com.openisle.service.PointService;
import com.openisle.service.SubscriptionService;
import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;
import lombok.RequiredArgsConstructor;
@@ -316,6 +320,37 @@ public class CommentService {
return result;
}
public Map<Long, List<User>> getParticipantsForPosts(List<Post> posts, int limit) {
if (posts == null || posts.isEmpty()) {
return Map.of();
}
Map<Long, LinkedHashSet<User>> map = new HashMap<>();
List<Long> postIds = new ArrayList<>(posts.size());
for (Post post : posts) {
postIds.add(post.getId());
LinkedHashSet<User> set = new LinkedHashSet<>();
set.add(post.getAuthor());
map.put(post.getId(), set);
}
for (Object[] row : commentRepository.findDistinctAuthorsByPostIds(postIds)) {
Long postId = (Long) row[0];
User author = (User) row[1];
LinkedHashSet<User> set = map.get(postId);
if (set != null) {
set.add(author);
}
}
Map<Long, List<User>> result = new HashMap<>(map.size());
for (Map.Entry<Long, LinkedHashSet<User>> entry : map.entrySet()) {
List<User> list = new ArrayList<>(entry.getValue());
if (list.size() > limit) {
list = list.subList(0, limit);
}
result.put(entry.getKey(), list);
}
return result;
}
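The `getParticipantsForPosts` method above avoids an N+1 query by fetching all comment authors in one batch and merging them with each post's author. The grouping-and-truncation core can be sketched in isolation (a simplified model using author names in place of `User` entities, and hypothetical `Object[]` rows standing in for the repository result):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;

class ParticipantBatcher {

    // Group comment authors by post id, seeding each set with the post's
    // own author. LinkedHashSet keeps insertion order and drops duplicates,
    // so the author stays first; each list is then capped at `limit`.
    public static Map<Long, List<String>> participants(
            Map<Long, String> postAuthors,  // postId -> post author
            List<Object[]> rows,            // each row: {postId, commenter}
            int limit) {
        Map<Long, LinkedHashSet<String>> map = new HashMap<>();
        for (Map.Entry<Long, String> e : postAuthors.entrySet()) {
            LinkedHashSet<String> set = new LinkedHashSet<>();
            set.add(e.getValue());
            map.put(e.getKey(), set);
        }
        for (Object[] row : rows) {
            LinkedHashSet<String> set = map.get((Long) row[0]);
            if (set != null) {
                set.add((String) row[1]);
            }
        }
        Map<Long, List<String>> result = new HashMap<>(map.size());
        for (Map.Entry<Long, LinkedHashSet<String>> e : map.entrySet()) {
            List<String> list = new ArrayList<>(e.getValue());
            if (list.size() > limit) {
                list = new ArrayList<>(list.subList(0, limit));
            }
            result.put(e.getKey(), list);
        }
        return result;
    }
}
```

As in the service, the post author is guaranteed to appear first and duplicate commenters are collapsed before the list is capped.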
public java.util.List<Comment> getCommentsByIds(java.util.List<Long> ids) {
log.debug("getCommentsByIds called for ids {}", ids);
java.util.List<Comment> comments = commentRepository.findAllById(ids);

View File

@@ -211,6 +211,7 @@ public class MessageService {
userSummaryDto.setId(message.getSender().getId());
userSummaryDto.setUsername(message.getSender().getUsername());
userSummaryDto.setAvatar(message.getSender().getAvatar());
userSummaryDto.setBot(message.getSender().isBot());
dto.setSender(userSummaryDto);
if (message.getReplyTo() != null) {
@@ -222,6 +223,7 @@ public class MessageService {
replySender.setId(reply.getSender().getId());
replySender.setUsername(reply.getSender().getUsername());
replySender.setAvatar(reply.getSender().getAvatar());
replySender.setBot(reply.getSender().isBot());
replyDto.setSender(replySender);
dto.setReplyTo(replyDto);
}
@@ -316,6 +318,7 @@ public class MessageService {
userDto.setId(p.getUser().getId());
userDto.setUsername(p.getUser().getUsername());
userDto.setAvatar(p.getUser().getAvatar());
userDto.setBot(p.getUser().isBot());
return userDto;
})
.collect(Collectors.toList())
@@ -365,6 +368,7 @@ public class MessageService {
userDto.setId(p.getUser().getId());
userDto.setUsername(p.getUser().getUsername());
userDto.setAvatar(p.getUser().getAvatar());
userDto.setBot(p.getUser().isBot());
return userDto;
})
.collect(Collectors.toList());

View File

@@ -7,6 +7,7 @@ import com.openisle.repository.NotificationRepository;
import com.openisle.repository.ReactionRepository;
import com.openisle.repository.UserRepository;
import com.openisle.service.EmailSender;
import com.openisle.exception.EmailSendException;
import java.util.ArrayList;
import java.util.EnumSet;
import java.util.HashSet;
@@ -17,6 +18,7 @@ import java.util.concurrent.Executor;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@@ -26,6 +28,7 @@ import org.springframework.transaction.support.TransactionSynchronizationManager
/** Service for creating and retrieving notifications. */
@Service
@RequiredArgsConstructor
@Slf4j
public class NotificationService {
private final NotificationRepository notificationRepository;
@@ -108,7 +111,11 @@ public class NotificationService {
post.getId(),
comment.getId()
);
emailSender.sendEmail(user.getEmail(), "有人回复了你", url);
try {
emailSender.sendEmail(user.getEmail(), "有人回复了你", url);
} catch (EmailSendException e) {
log.warn("Failed to send notification email to {}: {}", user.getEmail(), e.getMessage());
}
sendCustomPush(user, "有人回复了你", url);
} else if (type == NotificationType.REACTION && comment != null) {
// long count = reactionRepository.countReceived(comment.getAuthor().getUsername());

View File

@@ -7,7 +7,10 @@ import com.openisle.repository.PostReadRepository;
import com.openisle.repository.PostRepository;
import com.openisle.repository.UserRepository;
import java.time.LocalDateTime;
import java.util.List;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.stereotype.Service;
@Service
@@ -43,6 +46,14 @@ public class PostReadService {
);
}
public List<PostRead> getRecentReadsByUser(String username, int limit) {
User user = userRepository
.findByUsername(username)
.orElseThrow(() -> new com.openisle.exception.NotFoundException("User not found"));
Pageable pageable = PageRequest.of(0, limit);
return postReadRepository.findByUserOrderByLastReadAtDesc(user, pageable);
}
public long countReads(String username) {
User user = userRepository
.findByUsername(username)

View File

@@ -19,6 +19,7 @@ import com.openisle.repository.TagRepository;
import com.openisle.repository.UserRepository;
import com.openisle.search.SearchIndexEventPublisher;
import com.openisle.service.EmailSender;
import com.openisle.exception.EmailSendException;
import java.time.Duration;
import java.time.LocalDateTime;
import java.time.ZoneId;
@@ -338,6 +339,7 @@ public class PostService {
post.setCategory(category);
post.setTags(new HashSet<>(tags));
post.setStatus(publishMode == PublishMode.REVIEW ? PostStatus.PENDING : PostStatus.PUBLISHED);
post.setLastReplyAt(LocalDateTime.now());
// Default the visible scope to ALL when nothing is set
if (Objects.isNull(postVisibleScopeType)) {
@@ -663,11 +665,15 @@ public class PostService {
w.getEmail() != null &&
!w.getDisabledEmailNotificationTypes().contains(NotificationType.LOTTERY_WIN)
) {
emailSender.sendEmail(
w.getEmail(),
"你中奖了",
"恭喜你在抽奖贴 \"" + lp.getTitle() + "\" 中获奖"
);
try {
emailSender.sendEmail(
w.getEmail(),
"你中奖了",
"恭喜你在抽奖贴 \"" + lp.getTitle() + "\" 中获奖"
);
} catch (EmailSendException e) {
log.warn("Failed to send lottery win email to {}: {}", w.getEmail(), e.getMessage());
}
}
notificationService.createNotification(
w,
@@ -693,11 +699,19 @@ public class PostService {
.getDisabledEmailNotificationTypes()
.contains(NotificationType.LOTTERY_DRAW)
) {
emailSender.sendEmail(
lp.getAuthor().getEmail(),
"抽奖已开奖",
"您的抽奖贴 \"" + lp.getTitle() + "\" 已开奖"
);
try {
emailSender.sendEmail(
lp.getAuthor().getEmail(),
"抽奖已开奖",
"您的抽奖贴 \"" + lp.getTitle() + "\" 已开奖"
);
} catch (EmailSendException e) {
log.warn(
"Failed to send lottery draw email to {}: {}",
lp.getAuthor().getEmail(),
e.getMessage()
);
}
}
notificationService.createNotification(
lp.getAuthor(),
@@ -796,9 +810,10 @@ public class PostService {
boolean hasTags = tagIds != null && !tagIds.isEmpty();
java.util.List<Post> posts;
Pageable pageable = buildPageable(page, pageSize);
if (!hasCategories && !hasTags) {
posts = postRepository.findByStatusOrderByViewsDesc(PostStatus.PUBLISHED);
posts = postRepository.findByStatusOrderByPinnedAtDescViewsDesc(PostStatus.PUBLISHED, pageable);
} else if (hasCategories) {
java.util.List<Category> categories = categoryRepository.findAllById(categoryIds);
if (categories.isEmpty()) {
@@ -809,16 +824,18 @@ public class PostService {
if (tags.isEmpty()) {
return java.util.List.of();
}
posts = postRepository.findByCategoriesAndAllTagsOrderByViewsDesc(
posts = postRepository.findByCategoriesAndAllTagsOrderByPinnedAtDescViewsDesc(
categories,
tags,
PostStatus.PUBLISHED,
tags.size()
tags.size(),
pageable
);
} else {
posts = postRepository.findByCategoryInAndStatusOrderByViewsDesc(
posts = postRepository.findByCategoryInAndStatusOrderByPinnedAtDescViewsDesc(
categories,
PostStatus.PUBLISHED
PostStatus.PUBLISHED,
pageable
);
}
} else {
@@ -826,10 +843,15 @@ public class PostService {
if (tags.isEmpty()) {
return java.util.List.of();
}
posts = postRepository.findByAllTagsOrderByViewsDesc(tags, PostStatus.PUBLISHED, tags.size());
posts = postRepository.findByAllTagsOrderByPinnedAtDescViewsDesc(
tags,
PostStatus.PUBLISHED,
tags.size(),
pageable
);
}
return paginate(sortByPinnedAndViews(posts), page, pageSize);
return posts;
}
public List<Post> listPostsByLatestReply(Integer page, Integer pageSize) {
@@ -846,9 +868,13 @@ public class PostService {
boolean hasTags = tagIds != null && !tagIds.isEmpty();
java.util.List<Post> posts;
Pageable pageable = buildPageable(page, pageSize);
if (!hasCategories && !hasTags) {
posts = postRepository.findByStatusOrderByCreatedAtDesc(PostStatus.PUBLISHED);
posts = postRepository.findByStatusOrderByPinnedAtDescLastReplyAtDesc(
PostStatus.PUBLISHED,
pageable
);
} else if (hasCategories) {
java.util.List<Category> categories = categoryRepository.findAllById(categoryIds);
if (categories.isEmpty()) {
@@ -859,16 +885,18 @@ public class PostService {
if (tags.isEmpty()) {
return java.util.List.of();
}
posts = postRepository.findByCategoriesAndAllTagsOrderByCreatedAtDesc(
posts = postRepository.findByCategoriesAndAllTagsOrderByPinnedAtDescLastReplyAtDesc(
categories,
tags,
PostStatus.PUBLISHED,
tags.size()
tags.size(),
pageable
);
} else {
posts = postRepository.findByCategoryInAndStatusOrderByCreatedAtDesc(
posts = postRepository.findByCategoryInAndStatusOrderByPinnedAtDescLastReplyAtDesc(
categories,
PostStatus.PUBLISHED
PostStatus.PUBLISHED,
pageable
);
}
} else {
@@ -876,14 +904,15 @@ public class PostService {
if (tags.isEmpty()) {
return new ArrayList<>();
}
posts = postRepository.findByAllTagsOrderByCreatedAtDesc(
posts = postRepository.findByAllTagsOrderByPinnedAtDescLastReplyAtDesc(
tags,
PostStatus.PUBLISHED,
tags.size()
tags.size(),
pageable
);
}
return paginate(sortByPinnedAndLastReply(posts), page, pageSize);
return posts;
}
public List<Post> listPostsByCategories(
@@ -1381,6 +1410,13 @@ public class PostService {
.toList();
}
private Pageable buildPageable(Integer page, Integer pageSize) {
if (page == null || pageSize == null) {
return Pageable.unpaged();
}
return PageRequest.of(page, pageSize);
}
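`buildPageable` above falls back to `Pageable.unpaged()` when either parameter is missing, and the retired in-memory `paginate` helper applied the same contract to a pre-loaded list. A minimal sketch of that fallback contract (hypothetical `Paginator` class; the real service now delegates paging to the repository):

```java
import java.util.List;

class Paginator {

    // Mirror of the service's fallback: when page or pageSize is null the
    // caller gets the full list; otherwise a 0-indexed page slice, empty
    // when the offset runs past the end of the list.
    public static <T> List<T> paginate(List<T> items, Integer page, Integer pageSize) {
        if (page == null || pageSize == null) {
            return items;
        }
        int from = page * pageSize;
        if (from >= items.size()) {
            return List.of();
        }
        int to = Math.min(from + pageSize, items.size());
        return items.subList(from, to);
    }
}
```

Pushing this slicing into `PageRequest.of(page, pageSize)` lets the database apply `LIMIT`/`OFFSET` instead of loading and trimming the full result set, which is the point of the change in this diff.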
private List<Post> paginate(List<Post> posts, Integer page, Integer pageSize) {
if (page == null || pageSize == null) {
return posts;

View File

@@ -1,5 +1,6 @@
package com.openisle.service;
import com.openisle.exception.EmailSendException;
import java.util.HashMap;
import java.util.Map;
import org.springframework.beans.factory.annotation.Value;
@@ -7,8 +8,9 @@ import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.http.MediaType;
import org.springframework.scheduling.annotation.Async;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestClientException;
import org.springframework.web.client.RestTemplate;
@Service
@@ -23,7 +25,6 @@ public class ResendEmailSender extends EmailSender {
private final RestTemplate restTemplate = new RestTemplate();
@Override
@Async("notificationExecutor")
public void sendEmail(String to, String subject, String text) {
String url = "https://api.resend.com/emails"; // hypothetical endpoint
@@ -38,6 +39,20 @@ public class ResendEmailSender extends EmailSender {
body.put("from", "openisle <" + fromEmail + ">");
HttpEntity<Map<String, String>> entity = new HttpEntity<>(body, headers);
restTemplate.exchange(url, HttpMethod.POST, entity, String.class);
try {
ResponseEntity<String> response = restTemplate.exchange(
url,
HttpMethod.POST,
entity,
String.class
);
if (!response.getStatusCode().is2xxSuccessful()) {
throw new EmailSendException(
"Email service returned status " + response.getStatusCodeValue()
);
}
} catch (RestClientException e) {
throw new EmailSendException("Failed to send email: " + e.getMessage(), e);
}
}
}
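The sender now converts both transport failures and non-2xx responses into a single `EmailSendException`, so callers such as `NotificationService` catch one exception type. The status-check half of that contract can be sketched without Spring (hypothetical `EmailStatusCheck` helper; the real class wraps `RestTemplate.exchange` and `RestClientException` as well):

```java
class EmailStatusCheck {

    static class EmailSendException extends RuntimeException {
        EmailSendException(String message) { super(message); }
    }

    // Treat any non-2xx HTTP status as a send failure, matching the
    // is2xxSuccessful() check added to ResendEmailSender above.
    public static void requireSuccess(int statusCode) {
        if (statusCode < 200 || statusCode >= 300) {
            throw new EmailSendException("Email service returned status " + statusCode);
        }
    }
}
```

Collapsing the two failure modes into one unchecked exception is what lets call sites log and continue instead of letting a mail outage abort notification delivery.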

View File

@@ -118,7 +118,6 @@ public class UserService {
* @param user
*/
public void sendVerifyMail(User user, VerifyType verifyType) {
// Cache the verification code
String code = genCode();
String key;
String subject;
@@ -133,8 +132,9 @@ public class UserService {
subject = "请填写验证码以重置密码(有效期为5分钟)";
}
redisTemplate.opsForValue().set(key, code, 5, TimeUnit.MINUTES); // code expires after five minutes
emailService.sendEmail(user.getEmail(), subject, content);
// Cache the code only after the email is sent, so a failed send does not leave the user asked for a code they never received
redisTemplate.opsForValue().set(key, code, 5, TimeUnit.MINUTES); // code expires after five minutes
}
/**

View File

@@ -20,6 +20,7 @@ CREATE TABLE IF NOT EXISTS `users` (
`username` varchar(50) NOT NULL,
`verification_code` varchar(255) DEFAULT NULL,
`verified` bit(1) DEFAULT NULL,
`is_bot` bit(1) NOT NULL DEFAULT b'0',
PRIMARY KEY (`id`),
UNIQUE KEY `UK_users_email` (`email`),
UNIQUE KEY `UK_users_username` (`username`)

View File

@@ -8,10 +8,28 @@ DELETE FROM `users`;
-- Insert users: two regular users and one admin
-- username:admin/user1/user2 password:123456
INSERT INTO `users` (`id`, `approved`, `avatar`, `created_at`, `display_medal`, `email`, `experience`, `introduction`, `password`, `password_reset_code`, `point`, `register_reason`, `role`, `username`, `verification_code`, `verified`) VALUES
(1, b'1', 'https://openisle-1307107697.cos.ap-guangzhou.myqcloud.com/assert/image.png', '2025-09-01 16:08:17.426430', 'PIONEER', 'adminmail@openisle.com', 70, NULL, '$2a$10$x7HXjUyJTmrvqjnBlBQZH.vmfsC56NzTSWqQ6WqZqRjUO859EhviS', NULL, 110, '测试测试测试……', 'ADMIN', 'admin', NULL, b'1'),
(2, b'1', 'https://openisle-1307107697.cos.ap-guangzhou.myqcloud.com/assert/image.png', '2025-09-03 16:08:17.426430', 'PIONEER', 'usermail2@openisle.com', 70, NULL, '$2a$10$x7HXjUyJTmrvqjnBlBQZH.vmfsC56NzTSWqQ6WqZqRjUO859EhviS', NULL, 110, '测试测试测试……', 'USER', 'user1', NULL, b'1'),
(3, b'1', 'https://openisle-1307107697.cos.ap-guangzhou.myqcloud.com/assert/image.png', '2025-09-02 17:21:21.617666', 'PIONEER', 'usermail1@openisle.com', 40, NULL, '$2a$10$x7HXjUyJTmrvqjnBlBQZH.vmfsC56NzTSWqQ6WqZqRjUO859EhviS', NULL, 40, '测试测试测试……', 'USER', 'user2', NULL, b'1');
INSERT INTO `users` (
`id`,
`approved`,
`avatar`,
`created_at`,
`display_medal`,
`email`,
`experience`,
`introduction`,
`password`,
`password_reset_code`,
`point`,
`register_reason`,
`role`,
`username`,
`verification_code`,
`verified`,
`is_bot`
) VALUES
(1, b'1', 'https://openisle-1307107697.cos.ap-guangzhou.myqcloud.com/assert/image.png', '2025-09-01 16:08:17.426430', 'PIONEER', 'adminmail@openisle.com', 70, NULL, '$2a$10$x7HXjUyJTmrvqjnBlBQZH.vmfsC56NzTSWqQ6WqZqRjUO859EhviS', NULL, 110, '测试测试测试……', 'ADMIN', 'admin', NULL, b'1', b'0'),
(2, b'1', 'https://openisle-1307107697.cos.ap-guangzhou.myqcloud.com/assert/image.png', '2025-09-03 16:08:17.426430', 'PIONEER', 'usermail2@openisle.com', 70, NULL, '$2a$10$x7HXjUyJTmrvqjnBlBQZH.vmfsC56NzTSWqQ6WqZqRjUO859EhviS', NULL, 110, '测试测试测试……', 'USER', 'user1', NULL, b'1', b'0'),
(3, b'1', 'https://openisle-1307107697.cos.ap-guangzhou.myqcloud.com/assert/image.png', '2025-09-02 17:21:21.617666', 'PIONEER', 'usermail1@openisle.com', 40, NULL, '$2a$10$x7HXjUyJTmrvqjnBlBQZH.vmfsC56NzTSWqQ6WqZqRjUO859EhviS', NULL, 40, '测试测试测试……', 'USER', 'user2', NULL, b'1', b'0');
INSERT INTO `categories` (`id`,`description`,`icon`,`name`,`small_icon`) VALUES
(1,'测试用分类1','star','测试用分类1',NULL),

View File

@@ -0,0 +1,4 @@
-- Backfill last_reply_at for posts without comments to preserve latest-reply ordering
UPDATE posts
SET last_reply_at = created_at
WHERE last_reply_at IS NULL;

View File

@@ -0,0 +1,2 @@
ALTER TABLE users
ADD COLUMN is_bot BIT(1) NOT NULL DEFAULT b'0';

View File

@@ -0,0 +1,12 @@
CREATE TABLE IF NOT EXISTS post_reads (
id BIGINT NOT NULL AUTO_INCREMENT,
user_id BIGINT NOT NULL,
post_id BIGINT NOT NULL,
last_read_at DATETIME(6) DEFAULT NULL,
PRIMARY KEY (id),
UNIQUE KEY UK_post_reads_user_post (user_id, post_id),
KEY IDX_post_reads_user (user_id),
KEY IDX_post_reads_post (post_id),
CONSTRAINT FK_post_reads_user FOREIGN KEY (user_id) REFERENCES users (id) ON DELETE CASCADE,
CONSTRAINT FK_post_reads_post FOREIGN KEY (post_id) REFERENCES posts (id) ON DELETE CASCADE
);

bots/AGENTS.md Normal file
View File

@@ -0,0 +1,68 @@
# Bots Collaboration Guide
## 1) Scope
- Applies to the `bots/` directory and its subdirectories.
- This file standardizes conventions for bot script development, scheduling, and publishing.
## 2) Module structure and responsibilities
- `bot_father.ts`: bot base class; centralizes Agent initialization, MCP tool wiring, and the CLI entry point.
- `instance/reply_bot.ts`: general interaction reply bot (automatically handles mentions/comments).
- `instance/open_source_reply_bot.ts`: open-source Q&A bot (focused on code and the contribution workflow).
- `instance/daily_news_bot.ts`: daily news post bot.
- `instance/coffee_bot.ts`: good-morning lottery post bot.
## 3) Development conventions (new or refactored bots)
- New bots extend `BotFather` and implement at minimum:
  - `getAdditionalInstructions()`
  - `getCliQuery()`
- Keep the export convention `export const runWorkflow = ...`, and retain the `if (require.main === module)` CLI entry.
- Do not casually change the MCP tool allowlist in `bot_father.ts`; if a change is required, also assess the `mcp/` contract and production availability.
## 4) Environment variables and secrets
- Required: `OPENAI_API_KEY` (the run fails immediately without it).
- Common:
  - `OPENISLE_TOKEN` (OpenIsle MCP auth; GitHub Actions may map different secrets to it)
  - `APIFY_API_TOKEN` (weather MCP auth)
- Optional News Bot parameters:
  - `DAILY_NEWS_CATEGORY_ID`
  - `DAILY_NEWS_TAG_IDS`
- Never hard-code real tokens in code; inject them only through CI secrets or local environment variables.
## 5) Workflow sync rules (aligned with GitHub Actions)
- Related workflows:
  - `.github/workflows/reply-bots.yml`
  - `.github/workflows/open_source_reply_bot.yml`
  - `.github/workflows/news-bot.yml`
  - `.github/workflows/coffee-bot.yml`
- If you change a script entry point, a dependency, or an env key name, you must update the corresponding workflow.
- If you change the trigger cadence (cron) or a bot's behavioral boundaries, the change description must state the impact (frequency, cost, risk).
## 6) Behavioral constraints (against duplicates and runaway bots)
- Reply bots must stay idempotent: never reply twice to the same context.
- After handling unread items, call `mark_notifications_read` to clear the notifications.
- Keep an upper bound on batch processing (the current prompt convention is at most 10 items).
- Posting bots (news/coffee) must limit `create_post` calls (at most one post per run).
- The Open Source Reply Bot keeps a professional technical tone and must not drift away from open-source Q&A.
## 7) Local verification
- Install dependencies (matching CI):
  - `npm install --no-save @openai/agents tsx typescript`
- Single-script run examples:
  - `npx tsx bots/instance/reply_bot.ts`
  - `npx tsx bots/instance/open_source_reply_bot.ts`
  - `npx tsx bots/instance/daily_news_bot.ts`
  - `npx tsx bots/instance/coffee_bot.ts`
- At minimum, verify that the script starts, can call MCP, and exits with a non-zero code on failure.
## 8) Output requirements
- State which bot and which workflow the change affects.
- State whether tool-call boundaries changed (MCP tools / post count / reply strategy).
- State whether docs or ops configuration need a matching update.

bots/bot_father.ts Normal file
View File

@@ -0,0 +1,186 @@
import { Agent, Runner, hostedMcpTool, withTrace, webSearchTool } from "@openai/agents";
export type WorkflowInput = { input_as_text: string };
export abstract class BotFather {
protected readonly openisleToken = (process.env.OPENISLE_TOKEN ?? "").trim();
protected readonly weatherToken = (process.env.APIFY_API_TOKEN ?? "").trim();
protected readonly openisleMcp = this.createHostedMcpTool();
protected readonly weatherMcp = this.createWeatherMcpTool();
protected readonly webSearchPreview = this.createWebSearchPreviewTool();
protected readonly agent: Agent;
constructor(protected readonly name: string) {
console.log(`${this.name} starting...`);
console.log(
this.openisleToken
? "🔑 OPENISLE_TOKEN detected in environment; it will be attached to MCP requests."
: "🔓 OPENISLE_TOKEN not set; authenticated MCP tools may be unavailable."
);
console.log(
this.weatherToken
? "☁️ APIFY_API_TOKEN detected; weather MCP server will be available."
: "🌥️ APIFY_API_TOKEN not set; weather updates will be unavailable."
);
this.agent = new Agent({
name: this.name,
instructions: this.buildInstructions(),
tools: [
this.openisleMcp,
this.weatherMcp,
this.webSearchPreview
],
model: this.getModel(),
modelSettings: {
temperature: 0.7,
topP: 1,
maxTokens: 2048,
toolChoice: "auto",
store: true,
},
});
}
protected buildInstructions(): string {
const instructions = [
...this.getBaseInstructions(),
...this.getAdditionalInstructions(),
].filter(Boolean);
return instructions.join("\n");
}
protected getBaseInstructions(): string[] {
return [
"You are a helpful assistant for https://www.open-isle.com.",
"Finish tasks end-to-end before replying. If multiple MCP tools are needed, call them sequentially until the task is truly done.",
"When presenting the result, reply in Chinese with a concise summary and include any important URLs or IDs.",
"After finishing replies, call mark_notifications_read with all processed notification IDs to keep the inbox clean.",
];
}
private createWebSearchPreviewTool() {
return webSearchTool({
userLocation: {
type: "approximate",
country: undefined,
region: undefined,
city: undefined,
timezone: undefined
},
searchContextSize: "medium"
})
}
private createHostedMcpTool() {
const token = this.openisleToken;
const authConfig = token
? {
headers: {
Authorization: `Bearer ${token}`,
},
}
: {};
return hostedMcpTool({
serverLabel: "openisle_mcp",
serverUrl: "https://www.open-isle.com/mcp",
allowedTools: [
"search", // search posts and other content
"create_post", // create a new post
"reply_to_post", // reply to a post
"reply_to_comment", // reply to a comment
"recent_posts", // fetch the latest posts
"get_post", // fetch details of a specific post
"list_unread_messages", // list unread messages or notifications
"mark_notifications_read", // mark notifications as read
],
requireApproval: "never",
...authConfig,
});
}
private createWeatherMcpTool(): ReturnType<typeof hostedMcpTool> {
return hostedMcpTool({
serverLabel: "weather_mcp_server",
serverUrl: "https://jiri-spilka--weather-mcp-server.apify.actor/mcp",
requireApproval: "never",
allowedTools: [
"get_current_weather", // weather MCP tool
],
headers: {
Authorization: `Bearer ${this.weatherToken || ""}`,
},
});
}
protected getAdditionalInstructions(): string[] {
return [];
}
protected getModel(): string {
return "gpt-4o-mini";
}
protected createRunner(): Runner {
return new Runner({
workflowName: this.name,
traceMetadata: {
__trace_source__: "agent-builder",
workflow_id: "wf_69003cbd47e08190928745d3c806c0b50d1a01cfae052be8",
},
});
}
public async runWorkflow(workflow: WorkflowInput) {
if (!process.env.OPENAI_API_KEY) {
throw new Error("Missing OPENAI_API_KEY");
}
const runner = this.createRunner();
return await withTrace(`${this.name} run`, async () => {
const preview = workflow.input_as_text.trim();
console.log(
"📝 Received workflow input (preview):",
preview.length > 200 ? `${preview.slice(0, 200)}…` : preview
);
console.log("🚦 Starting agent run with maxTurns=16...");
const result = await runner.run(this.agent, workflow.input_as_text, {
maxTurns: 16,
});
console.log("📬 Agent run completed. Result keys:", Object.keys(result));
if (!result.finalOutput) {
throw new Error("Agent result is undefined (no final output).");
}
const openisleBotResult = { output_text: String(result.finalOutput) };
console.log(
"🤖 Agent result (length=%d):\n%s",
openisleBotResult.output_text.length,
openisleBotResult.output_text
);
return openisleBotResult;
});
}
protected abstract getCliQuery(): string;
public async runCli(): Promise<void> {
try {
const query = this.getCliQuery();
console.log("🔍 Running workflow...");
await this.runWorkflow({ input_as_text: query });
process.exit(0);
} catch (err: any) {
console.error("❌ Agent failed:", err?.stack || err);
process.exit(1);
}
}
}

View File

@@ -0,0 +1,56 @@
import { BotFather, WorkflowInput } from "../bot_father";
const WEEKDAY_NAMES = ["日", "一", "二", "三", "四", "五", "六"] as const;
class CoffeeBot extends BotFather {
constructor() {
super("Coffee Bot");
}
protected override getAdditionalInstructions(): string[] {
return [
"记住你的系统代号是 system有需要自称或签名时都要使用这个名字。",
"You are responsible for 发布每日抽奖早安贴。",
"创建帖子时,确保标题、奖品信息、开奖时间以及领奖方式完全符合 CLI 查询提供的细节。",
"正文需亲切友好,简洁明了,鼓励社区成员互动。",
"开奖说明需明确告知中奖者需私聊站长 @nagisa 领取奖励。",
"确保只发布一个帖子,避免重复调用 create_post。",
"使用标签为 weather_mcp_server 的 MCP 工具获取北京、上海、广州、深圳当天的天气信息,并把结果写入早安问候之后。",
];
}
protected override getCliQuery(): string {
const now = new Date(Date.now() + 8 * 60 * 60 * 1000);
const weekday = WEEKDAY_NAMES[now.getDay()];
const drawTime = new Date(now);
drawTime.setHours(15, 0, 0, 0);
return `
请立即在 https://www.open-isle.com 使用 create_post 发表一篇帖子,遵循以下要求:
1. 标题固定为「大家星期${weekday}早安--抽一杯咖啡」。
2. 正文包含:
- 亲切的早安问候;
- 早安问候后立即列出北京、上海、广州、深圳当天的天气信息,每行格式为“城市:天气描述,最低温~最高温”;天气需调用 weather_mcp_server 获取;
- 标注“领奖请私聊站长 @[nagisa]”;
- 鼓励大家留言互动。
3. 奖品信息:
- 明确奖品写作“Coffee”;
- 帖子类型必须为 LOTTERY;
- 奖品图片链接:https://openisle-1307107697.cos.accelerate.myqcloud.com/dynamic_assert/0d6a9b33e9ca4fe5a90540187d3f9ecb.png;
- 公布开奖时间为 ${drawTime},直接传 UTC 时间给接口,不要考虑时区问题;
- categoryId 固定为 10,tagIds 设为 [36]。
4. 帖子语言使用简体中文。
`.trim();
}
}
const coffeeBot = new CoffeeBot();
export const runWorkflow = async (workflow: WorkflowInput) => {
return coffeeBot.runWorkflow(workflow);
};
if (require.main === module) {
coffeeBot.runCli();
}

View File

@@ -0,0 +1,69 @@
import { BotFather, WorkflowInput } from "../bot_father";
const WEEKDAY_NAMES = ["日", "一", "二", "三", "四", "五", "六"] as const;
class DailyNewsBot extends BotFather {
constructor() {
super("Daily News Bot");
}
protected override getModel(): string {
return "gpt-4o";
}
protected override getAdditionalInstructions(): string[] {
return [
"You are DailyNewsBot,专职在 OpenIsle 发布每日新闻速递。",
"始终使用简体中文回复,并以结构化 Markdown 呈现内容。",
"发布内容前务必完成资讯核实:分别通过 web_search 调研 CoinDesk 所有要闻、Reuters 重点国际新闻,以及全球 AI 领域的重大进展。",
"整合新闻时,将同源资讯合并,突出影响力、涉及主体与潜在影响,保持语句简洁。",
"所有新闻要点都要附带来源链接,并在括号中标注来源站点名。",
"使用 weather_mcp_server 的 get_current_weather 获取北京、上海、广州、深圳的天气,并在正文中列表展示。",
"正文结尾补充一个行动建议或提醒,帮助读者快速把握重点。",
"严禁发布超过一篇帖子:create_post 只调用一次。",
];
}
protected override getCliQuery(): string {
const now = new Date(Date.now() + 8 * 60 * 60 * 1000);
const year = now.getFullYear();
const month = String(now.getMonth() + 1).padStart(2, "0");
const day = String(now.getDate()).padStart(2, "0");
const weekday = WEEKDAY_NAMES[now.getDay()];
const dateLabel = `${year}${month}${day}日 星期${weekday}`;
const isoDate = `${year}-${month}-${day}`;
const categoryId = Number(process.env.DAILY_NEWS_CATEGORY_ID ?? "6");
const tagIdsEnv = process.env.DAILY_NEWS_TAG_IDS ?? "3,33";
const tagIds = tagIdsEnv
.split(",")
.map((id) => Number(id.trim()))
.filter((id) => !Number.isNaN(id));
const finalTagIds = tagIds.length > 0 ? tagIds : [1];
const tagIdsText = `[${finalTagIds.join(", ")}]`;
return `
请立即在 https://www.open-isle.com 使用 create_post 发布一篇名为「OpenIsle 每日新闻速递|${dateLabel}」的帖子,并遵循以下要求:
1. 发布类型为 NORMAL,categoryId = ${categoryId},tagIds = ${tagIdsText};
2. 正文以简洁问候开头,不用再重复标题;
3. 使用 web_search 工具按以下顺序收集资讯,并在正文中以 Markdown 小节呈现,需要调用 3 次 web_search:
   - 「全球区块链与加密」:汇总 coindesk.com 版面所有重点新闻,列出至少 5 条;
   - 「国际新闻速览」:汇总 reuters.com 版面重点头条,关注宏观经济、市场波动或政策变化,列出至少 5 条;
   - 「AI 行业快讯」:检索今天全球 AI 领域的重要发布或事件(例如 OpenAI、Google、Meta、国内大模型厂商等),列出至少 5 条;
4. 每条新闻采用项目符号,先写结论再给出关键数字或细节,末尾添加来源超链接,格式示例:「**结论** —— 关键细节。(来源:[Reuters](URL))」;
5. 资讯整理完毕后,调用 weather_mcp_server.get_current_weather,列出北京、上海、广州、深圳今日天气,放置在「城市天气」小节下,本小节可加 emoji;
6. 最后一节为「今日提醒」,给出 2-3 条与新闻或天气相关的行动建议;
7. 若在资讯搜集过程中发现相互矛盾的信息,须在正文中以「⚠️ 风险提示」说明原因及尚待确认的点;
8. 发布完成后,不要再次调用 create_post。
`.trim();
}
}
const dailyNewsBot = new DailyNewsBot();
export const runWorkflow = async (workflow: WorkflowInput) => {
return dailyNewsBot.runWorkflow(workflow);
};
if (require.main === module) {
dailyNewsBot.runCli();
}
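
The `DAILY_NEWS_TAG_IDS` handling above (split on commas, trim, drop non-numeric entries, fall back to `[1]`) can be lifted into a standalone function. This sketch tightens the filter to positive integers — an assumption, since the original only rejects `NaN` and would accept `0` from an empty entry:

```typescript
// Parse a comma-separated tag-ID list from an env var, with a fallback when
// the input is missing, empty, or entirely malformed.
export function parseTagIds(raw: string | undefined, fallback: number[] = [1]): number[] {
  const ids = (raw ?? "")
    .split(",")
    .map((id) => Number(id.trim()))
    .filter((id) => Number.isInteger(id) && id > 0); // stricter than the bot's NaN check
  return ids.length > 0 ? ids : fallback;
}
```
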

View File

@@ -0,0 +1,65 @@
import { readFileSync } from "node:fs";
import path from "node:path";
import { BotFather, WorkflowInput } from "../bot_father";
class OpenSourceReplyBot extends BotFather {
constructor() {
super("OpenSource Reply Bot");
}
protected override getAdditionalInstructions(): string[] {
const knowledgeBase = this.loadKnowledgeBase();
return [
"You are OpenSourceReplyBot, a professional helper who focuses on answering open-source development and code-related questions for the OpenIsle community.",
"Respond in Chinese using well-structured Markdown sections such as 标题、列表、代码块等,让回复清晰易读。",
"保持语气专业、耐心、详尽,绝不使用表情符号或颜文字,也不要卖萌。",
"优先解答与项目代码、贡献流程、架构设计或排错相关的问题;",
"在需要时引用 README.md 与 CONTRIBUTING.md 中的要点,帮助用户快速定位文档位置。",
knowledgeBase,
].filter(Boolean);
}
protected override getCliQuery(): string {
return `
【AUTO】每30分钟自动巡检未读提及与评论,严格遵守以下流程:
1)调用 list_unread_messages 获取待处理的“提及/评论”;
2)按时间从新到旧逐条处理,最多10条,如需上下文请调用 get_post;
3)仅对与开源项目、代码实现或贡献流程直接相关的问题生成详尽的 Markdown 中文回复,
   若与主题无关则礼貌说明并跳过;
4)回复时引用 README 或 CONTRIBUTING 中的要点(如适用),并优先给出可执行的排查步骤或代码建议;
5)回复评论使用 reply_to_comment,回复帖子使用 reply_to_post;
6)若某通知最后一条已由本 bot 回复,则跳过避免重复;
7)整理已处理通知 ID,调用 mark_notifications_read;
8)结束时输出包含处理条目概览(URL 或 ID)的总结。`.trim();
}
private loadKnowledgeBase(): string {
const docs = ["../../README.md", "../../CONTRIBUTING.md"];
const sections: string[] = [];
for (const relativePath of docs) {
try {
const absolutePath = path.resolve(__dirname, relativePath);
const content = readFileSync(absolutePath, "utf-8").trim();
if (content) {
sections.push(`以下是 ${path.basename(absolutePath)} 的内容:\n${content}`);
}
} catch (error) {
sections.push(`未能加载 ${relativePath},请检查文件路径或权限。`);
}
}
return sections.join("\n\n");
}
}
const openSourceReplyBot = new OpenSourceReplyBot();
export const runWorkflow = async (workflow: WorkflowInput) => {
return openSourceReplyBot.runWorkflow(workflow);
};
if (require.main === module) {
openSourceReplyBot.runCli();
}
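
`loadKnowledgeBase` above deliberately swallows read failures and records a readable placeholder instead of throwing. A testable variant of the same pattern with an injectable reader — the injection is an addition for illustration, not how the bot is actually wired:

```typescript
import { readFileSync } from "node:fs";
import path from "node:path";

// Read each doc; on failure, keep going and record a placeholder section so
// the instruction list stays well-formed even when a file is missing.
export function loadKnowledgeBase(
  docs: string[],
  read: (p: string) => string = (p) => readFileSync(p, "utf-8"),
): string {
  const sections: string[] = [];
  for (const relativePath of docs) {
    try {
      const content = read(path.resolve(relativePath)).trim();
      if (content) {
        sections.push(`以下是 ${path.basename(relativePath)} 的内容:\n${content}`);
      }
    } catch {
      sections.push(`未能加载 ${relativePath},请检查文件路径或权限。`);
    }
  }
  return sections.join("\n\n");
}
```
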

View File

@@ -0,0 +1,38 @@
// reply_bot.ts
import { BotFather, WorkflowInput } from "../bot_father";
class ReplyBot extends BotFather {
constructor() {
super("OpenIsle Bot");
}
protected override getAdditionalInstructions(): string[] {
return [
"记住你的系统代号是 system任何需要自称、署名或解释身份的时候都使用这个名字。",
"以阴阳怪气的方式回复各种互动",
"你每天会发布咖啡抽奖贴,跟大家互动",
];
}
protected override getCliQuery(): string {
return `
【AUTO】无需确认,自动处理所有未读的提及与评论:
1)调用 list_unread_messages;
2)依次处理每条“提及/评论”:如需上下文则使用 get_post 获取,生成简明中文回复;如有 commentId 则用 reply_to_comment,否则用 reply_to_post;
3)跳过关注和系统事件;
4)保证幂等性:如该贴最后一条是你自己发的回复则跳过;
5)调用 mark_notifications_read,传入本次已处理的通知 ID 清理已读;
6)最多只处理最新10条,结束时仅输出简要摘要(包含 URL 或 ID)。
`.trim();
}
}
const replyBot = new ReplyBot();
export const runWorkflow = async (workflow: WorkflowInput) => {
return replyBot.runWorkflow(workflow);
};
if (require.main === module) {
replyBot.runCli();
}
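
Step 4 of the query asks the bot to stay idempotent by skipping threads where its own reply is already the most recent one. One possible shape for that check — the `Comment` type here is an illustrative stand-in, not the backend DTO:

```typescript
interface Comment {
  authorUsername: string;
  createdAt: string; // ISO timestamp
}

// Reply only when the most recent comment was NOT written by the bot itself.
export function shouldReply(comments: Comment[], botUsername: string): boolean {
  if (comments.length === 0) return true;
  const latest = [...comments].sort(
    (a, b) => Date.parse(b.createdAt) - Date.parse(a.createdAt),
  )[0];
  return latest.authorUsername !== botUsername;
}
```
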

deploy/AGENTS.md Normal file
View File

@@ -0,0 +1,40 @@
# Deploy 协作指引
## 1) 适用范围
- 作用于 `deploy/` 目录及其脚本。
- 该目录为高风险变更区,默认保守修改。
## 2) 当前部署基线
- 预发:`main` push 触发(见 `.github/workflows/deploy-staging.yml`)。
- 正式:定时任务触发(见 `.github/workflows/deploy.yml`)。
- 两者使用同一并发锁 `openisle-server`,避免服务器并发部署冲突。
## 3) 脚本修改原则
- 保留 `set -euo pipefail` 等安全执行特性。
- 变更服务列表或 `docker compose up` 参数时,必须说明影响范围。
- 不随意改动 `git fetch/checkout/reset` 逻辑;若改,必须附回滚方案。
- 任何“可能中断服务”的改动,都要在说明中给出停机/风险评估。
## 4) 环境与参数规则
- 部署依赖根目录 `.env`(由脚本中 `env_file`/`ENV_FILE` 传入)。
- `COMPOSE_PROJECT_NAME`、`NUXT_ENV`、服务名列表需保持可追踪且与 compose 一致。
- 若新增服务,需同步:
- `docker/docker-compose.yaml`
- 部署脚本中的 build/up 目标
- 必要时更新 workflow 说明
## 5) 验证建议
- 语法检查:
- `bash -n deploy/deploy.sh`
- `bash -n deploy/deploy_staging.sh`
- 变更前后做一次 `docker compose config` 思维核对(服务与 profile 是否正确)。
## 6) 输出要求
- 明确:影响环境(预发/正式)、影响服务、是否可能重建容器。
- 必填:回滚路径(例如切回上一 commit 并重新执行部署脚本)。

View File

@@ -30,62 +30,62 @@ services:
- dev_local_backend
- prod
# OpenSearch Service
opensearch:
user: "1000:1000"
build:
context: .
dockerfile: opensearch.Dockerfile
container_name: ${COMPOSE_PROJECT_NAME}-opensearch
environment:
- cluster.name=os-single
- node.name=os-node-1
- discovery.type=single-node
- bootstrap.memory_lock=true
- OPENSEARCH_JAVA_OPTS=-Xms1g -Xmx1g
- DISABLE_SECURITY_PLUGIN=true
- cluster.blocks.create_index=false
ulimits:
memlock: { soft: -1, hard: -1 }
nofile: { soft: 65536, hard: 65536 }
volumes:
- opensearch-data:/usr/share/opensearch/data
- opensearch-snapshots:/snapshots
ports:
- "${OPENSEARCH_PORT:-9200}:9200"
- "${OPENSEARCH_METRICS_PORT:-9600}:9600"
restart: unless-stopped
healthcheck:
test:
- CMD-SHELL
- curl -fsS http://127.0.0.1:9200/_cluster/health >/dev/null
interval: 10s
timeout: 5s
retries: 30
start_period: 60s
networks:
- openisle-network
profiles:
- dev
- dev_local_backend
# # OpenSearch Service
# opensearch:
# user: "1000:1000"
# build:
# context: .
# dockerfile: opensearch.Dockerfile
# container_name: ${COMPOSE_PROJECT_NAME}-opensearch
# environment:
# - cluster.name=os-single
# - node.name=os-node-1
# - discovery.type=single-node
# - bootstrap.memory_lock=true
# - OPENSEARCH_JAVA_OPTS=-Xms1g -Xmx1g
# - DISABLE_SECURITY_PLUGIN=true
# - cluster.blocks.create_index=false
# ulimits:
# memlock: { soft: -1, hard: -1 }
# nofile: { soft: 65536, hard: 65536 }
# volumes:
# - opensearch-data:/usr/share/opensearch/data
# - opensearch-snapshots:/snapshots
# ports:
# - "${OPENSEARCH_PORT:-9200}:9200"
# - "${OPENSEARCH_METRICS_PORT:-9600}:9600"
# restart: unless-stopped
# healthcheck:
# test:
# - CMD-SHELL
# - curl -fsS http://127.0.0.1:9200/_cluster/health >/dev/null
# interval: 10s
# timeout: 5s
# retries: 30
# start_period: 60s
# networks:
# - openisle-network
# profiles:
# - dev
# - dev_local_backend
dashboards:
image: opensearchproject/opensearch-dashboards:3.0.0
container_name: ${COMPOSE_PROJECT_NAME}-os-dashboards
environment:
OPENSEARCH_HOSTS: '["http://opensearch:9200"]'
DISABLE_SECURITY_DASHBOARDS_PLUGIN: "true"
ports:
- "${OPENSEARCH_DASHBOARDS_PORT:-5601}:5601"
depends_on:
- opensearch
restart: unless-stopped
networks:
- openisle-network
profiles:
- dev
- dev_local_backend
- prod
# dashboards:
# image: opensearchproject/opensearch-dashboards:3.0.0
# container_name: ${COMPOSE_PROJECT_NAME}-os-dashboards
# environment:
# OPENSEARCH_HOSTS: '["http://opensearch:9200"]'
# DISABLE_SECURITY_DASHBOARDS_PLUGIN: "true"
# ports:
# - "${OPENSEARCH_DASHBOARDS_PORT:-5601}:5601"
# depends_on:
# - opensearch
# restart: unless-stopped
# networks:
# - openisle-network
# profiles:
# - dev
# - dev_local_backend
# - prod
rabbitmq:
image: rabbitmq:3.13-management
@@ -200,7 +200,6 @@ services:
- openisle-network
profiles:
- dev
- dev_local_backend
- prod

docs/AGENTS.md Normal file
View File

@@ -0,0 +1,40 @@
# Docs 协作指引
## 1) 适用范围
- 作用于 `docs/` 目录及其子目录。
- 文档需服务“开发者真实使用”,优先准确性与可执行性。
## 2) 文档架构
- 内容目录:`content/docs/`
- 生成脚本:`scripts/generate-docs.ts`
- OpenAPI 输入配置:`lib/openapi.ts`
- 前端框架:Fumadocs + Next.js(Bun 工具链)
## 3) 编辑规则
- 优先修正“与代码不一致”的文档,不复制过时描述。
- 涉及技术栈说明时,以当前代码为准(例如后端为 JPA/Repository)。
- OpenAPI 自动生成目录(`content/docs/openapi/(generated)`)不要手工细改,改源头配置与脚本。
- 结构性改动优先维持导航稳定(`meta.json` 与已有 slug)。
## 4) OpenAPI 同步规则
- 后端 API 变更后,应重新生成文档页面:
- `bun run generate`
- 若接口来源地址或文档聚合策略变化,更新:
- `lib/openapi.ts`
- `scripts/generate-docs.ts`
## 5) 验证命令
- 安装依赖:`bun install`
- 生成 API 文档:`bun run generate`
- 构建校验:`bun run build`
- 本地预览:`bun dev`
## 6) 输出要求
- 说明更新了哪些文档入口(backend/frontend/openapi)。
- 说明是否需要后端先部署后再刷新文档产物。

frontend_nuxt/AGENTS.md Normal file
View File

@@ -0,0 +1,54 @@
# Frontend(Nuxt)协作指引
## 1) 适用范围
- 作用于 `frontend_nuxt/` 目录及其子目录。
- 本文件仅覆盖前端范围;跨服务规则仍遵循根 `AGENTS.md`。
## 2) 代码组织约定
- `pages/`:路由页面与页面级数据获取。
- `components/`:可复用视图组件。
- `composables/`:状态与行为复用(如 WebSocket、倒计时)。
- `plugins/`:运行时插件(鉴权 fetch、主题、第三方库注入)。
- `utils/`:纯工具函数(时间、鉴权 token、平台适配)。
- `assets/`、`public/`:静态资源与样式。
## 3) 前端修改规则
- 优先保持现有交互和视觉风格一致,不做无关 UI 重构。
- 接口字段变更时,先更新调用点,再统一处理回退逻辑与空值分支。
- SSR 与客户端代码分离:
  - 涉及 `window`、`localStorage`、WebSocket 的逻辑只在 client 侧运行。
- 与鉴权相关的 401 处理,保持与 `plugins/auth-fetch.client.ts` 行为一致。
## 4) 环境变量与运行时配置
- 统一通过 `nuxt.config.ts` 的 `runtimeConfig.public` 读取。
- 关键键位保持一致:
- `NUXT_PUBLIC_API_BASE_URL`
- `NUXT_PUBLIC_WEBSOCKET_URL`
- `NUXT_PUBLIC_WEBSITE_BASE_URL`
- 变量改动需同步根目录 `.env.example` 与文档说明。
## 5) 实时消息链路注意事项
- WebSocket 入口:
- `composables/useWebSocket.js`
- 若改订阅目标(`/topic/...`、`/user/...`),必须与后端推送目的地保持一致。
- 重连与重订阅逻辑不可被破坏;避免引入重复订阅和泄漏。
## 6) 构建与验证
- 标准验证:`npm run build`
- 本地联调:`npm run dev`
- 涉及 WebSocket/通知的改动,建议至少手工验证:
- 登录后连接建立
- 收到消息时 UI 状态更新
- 断线重连后仍可订阅
## 7) 输出要求
- 标注影响页面/组件路径。
- 标注是否引入 API 字段兼容处理。
- 标注是否需要后端或 WebSocket 服务配合发布。

View File

@@ -16,6 +16,7 @@
<div class="info-content-header-left">
<span class="user-name">{{ comment.userName }}</span>
<span v-if="isCommentFromPostAuthor" class="op-badge" title="楼主">OP</span>
<span v-if="comment.isBot" class="bot-badge" title="Bot">Bot</span>
<medal-one class="medal-icon" />
<NuxtLink
v-if="comment.medal"
@@ -522,6 +523,21 @@ const handleContentClick = (e) => {
line-height: 1;
}
.bot-badge {
display: inline-flex;
align-items: center;
justify-content: center;
margin-left: 6px;
padding: 0 6px;
height: 18px;
border-radius: 9px;
background-color: rgba(76, 175, 80, 0.16);
color: #2e7d32;
font-size: 12px;
font-weight: 600;
line-height: 1;
}
.medal-icon {
font-size: 12px;
opacity: 0.6;

View File

@@ -0,0 +1,149 @@
<template>
<div class="timeline-container">
<div class="timeline-header">
<div class="timeline-title">浏览了文章</div>
<div class="timeline-date">{{ formattedDate }}</div>
</div>
<div class="article-container">
<NuxtLink :to="postLink" class="timeline-article-link">
{{ item.post?.title }}
</NuxtLink>
<div class="timeline-snippet">
{{ strippedSnippet }}
</div>
<div class="article-meta" v-if="hasMeta">
<ArticleCategory v-if="item.post?.category" :category="item.post.category" />
<ArticleTags :tags="item.post?.tags" />
<div class="article-comment-count" v-if="item.post?.commentCount !== undefined">
<comment-one class="article-comment-count-icon" />
<span>{{ item.post?.commentCount }}</span>
</div>
</div>
</div>
</div>
</template>
<script setup>
import { computed } from 'vue'
import { stripMarkdown } from '~/utils/markdown'
import TimeManager from '~/utils/time'
const props = defineProps({
item: { type: Object, required: true },
})
const postLink = computed(() => {
const id = props.item.post?.id
return id ? `/posts/${id}` : '#'
})
const formattedDate = computed(() =>
TimeManager.format(props.item.lastReadAt ?? props.item.createdAt),
)
const strippedSnippet = computed(() => stripMarkdown(props.item.post?.snippet ?? ''))
const hasMeta = computed(() => {
const tags = props.item.post?.tags ?? []
const hasTags = Array.isArray(tags) && tags.length > 0
const hasCategory = !!props.item.post?.category
const hasCommentCount =
props.item.post?.commentCount !== undefined && props.item.post?.commentCount !== null
return hasTags || hasCategory || hasCommentCount
})
</script>
<style scoped>
.timeline-container {
display: flex;
flex-direction: column;
padding-top: 5px;
gap: 12px;
border-radius: 10px;
background: var(--timeline-card-background, transparent);
}
.timeline-header {
display: flex;
justify-content: space-between;
align-items: center;
}
.timeline-title {
font-size: 16px;
font-weight: 600;
}
.timeline-date {
font-size: 12px;
color: var(--timeline-date-color, #888);
}
.article-container {
display: flex;
flex-direction: column;
border: 1px solid var(--normal-border-color);
border-radius: 10px;
padding: 10px;
gap: 6px;
}
.timeline-article-link {
font-size: 18px;
font-weight: 600;
color: var(--link-color);
text-decoration: none;
}
.timeline-article-link:hover {
text-decoration: underline;
}
.timeline-snippet {
color: var(--timeline-snippet-color, #666);
font-size: 14px;
line-height: 1.6;
}
.article-meta {
display: flex;
flex-wrap: wrap;
gap: 8px;
align-items: center;
}
.article-tags {
display: flex;
flex-wrap: wrap;
gap: 6px;
}
.article-tag {
background-color: var(--article-info-background-color);
border-radius: 6px;
padding: 2px 6px;
font-size: 12px;
color: var(--text-color);
}
.article-comment-count {
display: inline-flex;
align-items: center;
gap: 4px;
font-size: 12px;
color: var(--text-color);
}
.article-comment-count-icon {
width: 16px;
height: 16px;
}
@media (max-width: 768px) {
.timeline-article-link {
font-size: 16px;
}
.timeline-snippet {
font-size: 13px;
}
}
</style>

View File

@@ -159,6 +159,12 @@ const selectedTags = ref([])
const route = useRoute()
const tagOptions = ref([])
const categoryOptions = ref([])
const clearFilters = () => {
selectedCategory.value = ''
selectedTags.value = []
selectedCategoryGlobal.value = ''
selectedTagsGlobal.value = []
}
const topics = ref(['最新回复', '最新', '精选', '排行榜' /*, '热门', '类别'*/])
const selectedTopicCookie = useCookie('homeTab')
@@ -218,8 +224,18 @@ watch(
(query) => {
const category = query.category
const tags = query.tags
category && selectedCategorySet(category)
tags && selectedTagsSet(tags)
if (category) {
selectedCategorySet(category)
} else {
selectedCategory.value = ''
selectedCategoryGlobal.value = ''
}
if (tags) {
selectedTagsSet(tags)
} else {
selectedTags.value = []
selectedTagsGlobal.value = []
}
},
)
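
The watcher above treats the URL query as the single source of truth: an absent `category` or `tags` parameter now resets the corresponding filter instead of leaving a stale value behind. The same rule expressed as a pure function — a sketch, since the real page splits tags inside `selectedTagsSet`:

```typescript
interface FilterState {
  category: string;
  tags: string[];
}

// Derive the complete filter state from the query; missing params reset
// their filter rather than preserving whatever was selected before.
export function filtersFromQuery(query: { category?: string; tags?: string }): FilterState {
  return {
    category: query.category ?? "",
    tags: query.tags ? query.tags.split(",").filter(Boolean) : [],
  };
}
```
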
@@ -367,12 +383,18 @@ watch(selectedTopic, (val) => {
if (import.meta.server) {
await loadOptions()
}
const handleRefreshHome = () => {
clearFilters()
refreshFirst()
}
onMounted(() => {
if (categoryOptions.value.length === 0 && tagOptions.value.length === 0) loadOptions()
window.addEventListener('refresh-home', refreshFirst)
window.addEventListener('refresh-home', handleRefreshHome)
})
onBeforeUnmount(() => {
window.removeEventListener('refresh-home', refreshFirst)
window.removeEventListener('refresh-home', handleRefreshHome)
})
/** 供 InfiniteLoadMore 重建用的 key筛选/Tab 改变即重建内部状态 */

View File

@@ -58,12 +58,15 @@ const submitLogin = async () => {
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ username: username.value, password: password.value }),
})
const data = await res.json()
const data = await res.json().catch(() => ({}))
if (res.ok && data.token) {
setToken(data.token)
toast.success('登录成功')
registerPush()
await navigateTo('/', { replace: true })
} else if (data.reason_code === 'EMAIL_SEND_FAILED') {
const msg = data.error || data.message || res.statusText || '登录失败'
toast.error(`${res.status} ${msg} (${data.reason_code})`)
} else if (data.reason_code === 'NOT_VERIFIED') {
toast.info('当前邮箱未验证,已经为您重新发送验证码')
await navigateTo(
@@ -76,10 +79,12 @@ const submitLogin = async () => {
} else if (data.reason_code === 'NOT_APPROVED') {
await navigateTo({ path: '/signup-reason', query: { token: data.token } }, { replace: true })
} else {
toast.error(data.error || '登录失败')
const msg = data.error || data.message || res.statusText || '登录失败'
const reason = data.reason_code ? ` (${data.reason_code})` : ''
toast.error(`${res.status} ${msg}${reason}`)
}
} catch (e) {
toast.error('登录失败')
toast.error(`登录失败: ${e.message}`)
} finally {
isWaitingForLogin.value = false
}

View File

@@ -181,6 +181,12 @@
<PostChangeLogItem v-else :log="item" :title="title" />
</template>
</BaseTimeline>
<InfiniteLoadMore
v-if="timelineItems.length > 0"
:key="commentSort"
:on-load="loadMoreTimeline"
:pause="isLoadingMoreComments || isFetchingComments"
/>
</div>
</div>
@@ -211,6 +217,7 @@ import CommentEditor from '~/components/CommentEditor.vue'
import BaseTimeline from '~/components/BaseTimeline.vue'
import BasePlaceholder from '~/components/BasePlaceholder.vue'
import PostChangeLogItem from '~/components/PostChangeLogItem.vue'
import InfiniteLoadMore from '~/components/InfiniteLoadMore.vue'
import ArticleTags from '~/components/ArticleTags.vue'
import ArticleCategory from '~/components/ArticleCategory.vue'
import ReactionsGroup from '~/components/ReactionsGroup.vue'
@@ -276,6 +283,10 @@ const currentIndex = ref(1)
const subscribed = ref(false)
const commentSort = ref('NEWEST')
const isFetchingComments = ref(false)
const commentPage = ref(0)
const commentPageSize = 10
const hasMoreComments = ref(true)
const isLoadingMoreComments = ref(false)
const isMobile = useIsMobile()
const timelineItems = ref([])
@@ -377,6 +388,7 @@ const mapComment = (
text: c.content,
reactions: c.reactions || [],
pinned: Boolean(c.pinned ?? c.pinnedAt ?? c.pinned_at),
isBot: Boolean(c.author?.bot),
reply: (c.replies || []).map((r) =>
mapComment(r, c.author.username, c.author.avatar, c.author.id, level + 1),
),
@@ -532,7 +544,7 @@ const {
} catch (err) {}
},
{
server: false,
server: true,
lazy: false,
},
)
@@ -868,17 +880,33 @@ const fetchCommentSorts = () => {
])
}
const fetchCommentsAndChangeLog = async () => {
isFetchingComments.value = true
  console.info('Fetching comments and change log', { postId, sort: commentSort.value })
const fetchCommentsAndChangeLog = async ({ pageNo = 0, append = false } = {}) => {
if (isLoadingMoreComments.value) return false
if (!append) {
hasMoreComments.value = true
commentPage.value = 0
}
if (pageNo === 0) {
isFetchingComments.value = true
} else {
isLoadingMoreComments.value = true
}
  console.info('Fetching comments and change log', {
postId,
sort: commentSort.value,
page: pageNo,
pageSize: commentPageSize,
})
let done = false
try {
const token = getToken()
const res = await fetch(
`${API_BASE_URL}/api/posts/${postId}/comments?sort=${commentSort.value}`,
{
headers: { Authorization: token ? `Bearer ${token}` : '' },
},
)
const url = new URL(`${API_BASE_URL}/api/posts/${postId}/comments`)
url.searchParams.set('sort', commentSort.value)
url.searchParams.set('page', String(pageNo))
url.searchParams.set('pageSize', String(commentPageSize))
const res = await fetch(url.toString(), {
headers: { Authorization: token ? `Bearer ${token}` : '' },
})
console.info('Fetch comments response status', res.status)
if (res.ok) {
const data = await res.json()
@@ -901,23 +929,48 @@ const fetchCommentsAndChangeLog = async () => {
}
}
comments.value = commentList
changeLogs.value = changeLogList
timelineItems.value = newTimelineItemList
if (append) {
comments.value.push(...commentList)
changeLogs.value.push(...changeLogList)
timelineItems.value.push(...newTimelineItemList)
commentPage.value = pageNo
} else {
comments.value = commentList
changeLogs.value = changeLogList
timelineItems.value = newTimelineItemList
commentPage.value = 0
}
isFetchingComments.value = false
done = data.length < commentPageSize
hasMoreComments.value = !done
await nextTick()
gatherPostItems()
return done
}
} catch (e) {
console.debug('Fetch comments error', e)
hasMoreComments.value = false
return true
} finally {
isFetchingComments.value = false
isLoadingMoreComments.value = false
}
}
const fetchTimeline = async () => {
await fetchCommentsAndChangeLog()
hasMoreComments.value = true
commentPage.value = 0
comments.value = []
changeLogs.value = []
timelineItems.value = []
await fetchCommentsAndChangeLog({ pageNo: 0, append: false })
}
const loadMoreTimeline = async () => {
if (!hasMoreComments.value || isLoadingMoreComments.value) return true
const nextPage = commentPage.value + 1
const done = await fetchCommentsAndChangeLog({ pageNo: nextPage, append: true })
return done || !hasMoreComments.value
}
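
The pagination protocol above signals completion by returning fewer rows than `pageSize`, and `loadMoreTimeline` reports `done=true` so `InfiniteLoadMore` can stop observing. A minimal model of the append-and-done bookkeeping, separated from the fetch for clarity:

```typescript
// Append one fetched page into state and return whether pagination is done.
// A short page (fewer rows than pageSize) marks the final page.
export function appendPage<T>(
  state: { items: T[]; page: number; hasMore: boolean },
  pageNo: number,
  rows: T[],
  pageSize: number,
): boolean {
  state.items.push(...rows);
  state.page = pageNo;
  const done = rows.length < pageSize;
  state.hasMore = !done;
  return done;
}
```
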
watch(commentSort, async () => {
@@ -928,8 +981,17 @@ const jumpToHashComment = async () => {
const hash = location.hash
if (hash.startsWith('#comment-')) {
const id = hash.substring('#comment-'.length)
await new Promise((resolve) => setTimeout(resolve, 500))
const el = document.getElementById('comment-' + id)
await new Promise((resolve) => setTimeout(resolve, 300))
let el = document.getElementById('comment-' + id)
// 若未加载到目标评论,尝试继续分页加载直到找到或无更多
while (!el && hasMoreComments.value) {
const done = await loadMoreTimeline()
await nextTick()
el = document.getElementById('comment-' + id)
if (done) break
}
if (el) {
const top = el.getBoundingClientRect().top + window.scrollY - headerHeight - 20 // 20 for beauty
window.scrollTo({ top, behavior: 'smooth' })
@@ -1394,10 +1456,6 @@ onMounted(async () => {
font-weight: bold;
}
.reaction-action.copy-link:hover {
background-color: #e2e2e2;
}
.comment-editor-wrapper {
position: relative;
}

View File

@@ -139,8 +139,7 @@ const sendVerification = async () => {
inviteToken: inviteToken.value,
}),
})
isWaitingForEmailSent.value = false
const data = await res.json()
const data = await res.json().catch(() => ({}))
if (res.ok) {
emailStep.value = 1
toast.success('验证码已发送,请查看邮箱')
@@ -149,10 +148,14 @@ const sendVerification = async () => {
if (data.field === 'email') emailError.value = data.error
if (data.field === 'password') passwordError.value = data.error
} else {
toast.error(data.error || '发送失败')
const msg = data.error || data.message || res.statusText || '发送失败'
const reason = data.reason_code ? ` (${data.reason_code})` : ''
toast.error(`${res.status} ${msg}${reason}`)
}
} catch (e) {
toast.error('发送失败')
toast.error(`发送失败: ${e.message}`)
} finally {
isWaitingForEmailSent.value = false
}
}

View File

@@ -191,14 +191,25 @@
>
评论和回复
</div>
<div
v-if="isMine"
:class="['timeline-tab-item', { selected: timelineFilter === 'reads' }]"
@click="timelineFilter = 'reads'"
>
浏览记录
</div>
</div>
<BasePlaceholder
v-if="filteredTimelineItems.length === 0"
text="暂无时间线"
v-if="
timelineFilter === 'reads'
? readPosts.length === 0
: filteredTimelineItems.length === 0
"
:text="timelineFilter === 'reads' ? '暂无浏览记录' : '暂无时间线'"
icon="inbox"
/>
<div class="timeline-list">
<BaseTimeline :items="filteredTimelineItems">
<BaseTimeline v-if="timelineFilter !== 'reads'" :items="filteredTimelineItems">
<template #item="{ item }">
<template v-if="item.type === 'post'">
<TimelinePostItem :item="item" />
@@ -214,6 +225,11 @@
</template>
</template>
</BaseTimeline>
<BaseTimeline v-else :items="readPosts">
<template #item="{ item }">
<TimelineReadItem :item="item" />
</template>
</BaseTimeline>
</div>
</div>
@@ -276,6 +292,7 @@ import BaseTabs from '~/components/BaseTabs.vue'
import LevelProgress from '~/components/LevelProgress.vue'
import TimelineCommentGroup from '~/components/TimelineCommentGroup.vue'
import TimelinePostItem from '~/components/TimelinePostItem.vue'
import TimelineReadItem from '~/components/TimelineReadItem.vue'
import TimelineTagItem from '~/components/TimelineTagItem.vue'
import BaseUserAvatar from '~/components/BaseUserAvatar.vue'
import UserList from '~/components/UserList.vue'
@@ -299,12 +316,15 @@ const hotReplies = ref([])
const hotTags = ref([])
const favoritePosts = ref([])
const timelineItems = ref([])
const readPosts = ref([])
const timelineFilter = ref('all')
const filteredTimelineItems = computed(() => {
if (timelineFilter.value === 'articles') {
return timelineItems.value.filter((item) => item.type === 'post')
} else if (timelineFilter.value === 'comments') {
return timelineItems.value.filter((item) => item.type === 'comment' || item.type === 'reply')
} else if (timelineFilter.value === 'reads') {
return []
}
return timelineItems.value
})
@@ -477,6 +497,27 @@ const fetchTimeline = async () => {
timelineItems.value = combineDiscussionItems(mapped)
}
const fetchReadHistory = async () => {
if (!isMine.value) {
readPosts.value = []
return
}
const token = getToken()
if (!token) {
readPosts.value = []
return
}
const res = await fetch(`${API_BASE_URL}/api/users/${username}/read-posts?limit=50`, {
headers: { Authorization: `Bearer ${token}` },
})
if (res.ok) {
const data = await res.json()
readPosts.value = data.map((r) => ({ ...r, icon: 'file-text' }))
} else {
readPosts.value = []
}
}
const fetchFollowUsers = async () => {
const [followerRes, followingRes] = await Promise.all([
fetch(`${API_BASE_URL}/api/users/${username}/followers`),
@@ -508,6 +549,12 @@ const loadTimeline = async () => {
tabLoading.value = false
}
const loadReadHistory = async () => {
tabLoading.value = true
await fetchReadHistory()
tabLoading.value = false
}
const loadFollow = async () => {
tabLoading.value = true
await fetchFollowUsers()
@@ -624,8 +671,14 @@ onMounted(init)
watch(selectedTab, async (val) => {
// navigateTo({ query: { ...route.query, tab: val } }, { replace: true })
if (val === 'timeline' && timelineItems.value.length === 0) {
await loadTimeline()
if (val === 'timeline') {
if (timelineFilter.value === 'reads') {
if (readPosts.value.length === 0) {
await loadReadHistory()
}
} else if (timelineItems.value.length === 0) {
await loadTimeline()
}
} else if (val === 'following' && followers.value.length === 0 && followings.value.length === 0) {
await loadFollow()
} else if (val === 'favorites' && favoritePosts.value.length === 0) {
@@ -634,6 +687,23 @@ watch(selectedTab, async (val) => {
await loadAchievements()
}
})
watch(timelineFilter, async (val) => {
if (selectedTab.value !== 'timeline') return
if (val === 'reads') {
if (readPosts.value.length === 0) {
await loadReadHistory()
}
} else if (timelineItems.value.length === 0) {
await loadTimeline()
}
})
watch(isMine, (val) => {
if (!val && timelineFilter.value === 'reads') {
timelineFilter.value = 'all'
}
})
</script>
<style scoped>

View File

@@ -82,6 +82,56 @@ export async function markNotificationsRead(ids) {
}
}
const MARK_ALL_FETCH_SIZE = 100
const MARK_ALL_CHUNK_SIZE = 200
const MARK_ALL_MAX_PAGES = 200
async function fetchUnreadNotificationsPage(page, size) {
const config = useRuntimeConfig()
const API_BASE_URL = config.public.apiBaseUrl
const token = getToken()
if (!token) throw new Error('NO_TOKEN')
const res = await fetch(`${API_BASE_URL}/api/notifications/unread?page=${page}&size=${size}`, {
headers: {
Authorization: `Bearer ${token}`,
},
})
if (!res.ok) throw new Error('FETCH_UNREAD_FAILED')
const data = await res.json()
return Array.isArray(data) ? data : []
}
async function collectUnreadNotificationIds(excludedTypes = []) {
const excludedTypeSet = new Set(excludedTypes)
const ids = []
for (let page = 0; page < MARK_ALL_MAX_PAGES; page++) {
const pageData = await fetchUnreadNotificationsPage(page, MARK_ALL_FETCH_SIZE)
if (pageData.length === 0) break
for (const notification of pageData) {
if (!notification || excludedTypeSet.has(notification.type)) continue
if (typeof notification.id !== 'number') continue
ids.push(notification.id)
}
if (pageData.length < MARK_ALL_FETCH_SIZE) break
}
return [...new Set(ids)]
}
async function markNotificationsReadInChunks(ids) {
for (let i = 0; i < ids.length; i += MARK_ALL_CHUNK_SIZE) {
const chunk = ids.slice(i, i + MARK_ALL_CHUNK_SIZE)
const ok = await markNotificationsRead(chunk)
if (!ok) return false
}
return true
}
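
The two helpers above combine into a fetch-all-then-chunk flow: page through unread notifications until a short page, filter and de-duplicate IDs, then acknowledge them in fixed-size batches. A condensed sketch with the HTTP calls injected as callbacks — the callback names are illustrative, not the real API:

```typescript
// Collect all unread notification IDs (paged), then mark them read in chunks.
// Returns the unique IDs that were acknowledged; throws if any chunk fails.
export async function markAllUnread(
  fetchPage: (page: number, size: number) => Promise<{ id: number; type: string }[]>,
  markRead: (ids: number[]) => Promise<boolean>,
  { fetchSize = 100, chunkSize = 200, maxPages = 200, excludedTypes = [] as string[] } = {},
): Promise<number[]> {
  const excluded = new Set(excludedTypes);
  const ids: number[] = [];
  for (let page = 0; page < maxPages; page++) {
    const rows = await fetchPage(page, fetchSize);
    if (rows.length === 0) break;
    for (const n of rows) {
      if (!excluded.has(n.type)) ids.push(n.id);
    }
    if (rows.length < fetchSize) break; // short page => last page
  }
  const unique = [...new Set(ids)];
  for (let i = 0; i < unique.length; i += chunkSize) {
    const ok = await markRead(unique.slice(i, i + chunkSize));
    if (!ok) throw new Error("MARK_READ_FAILED");
  }
  return unique;
}
```
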
export async function fetchNotificationPreferences() {
try {
const config = useRuntimeConfig()
@@ -390,29 +440,37 @@ function createFetchNotifications() {
}
const markAllRead = async () => {
    // 排除 REGISTER_REQUEST 类型消息
const idsToMark = notifications.value
// 为了覆盖分页中的全部未读,先从后端分页拉取全部未读 ID除 REGISTER_REQUEST)。
const localIdsToMark = notifications.value
.filter((n) => n.type !== 'REGISTER_REQUEST' && !n.read)
.map((n) => n.id)
if (idsToMark.length === 0) return
notifications.value.forEach((n) => {
if (n.type !== 'REGISTER_REQUEST') n.read = true
})
notificationState.unreadCount = notifications.value.filter((n) => !n.read).length
const ok = await markNotificationsRead(idsToMark)
if (!ok) {
try {
const idsToMark = await collectUnreadNotificationIds(['REGISTER_REQUEST'])
if (idsToMark.length > 0) {
const ok = await markNotificationsReadInChunks(idsToMark)
if (!ok) throw new Error('MARK_READ_FAILED')
}
await fetchUnreadCount()
if (authState.role === 'ADMIN') {
toast.success('已读所有消息(注册请求除外)')
} else {
toast.success('已读所有消息')
}
} catch (e) {
notifications.value.forEach((n) => {
if (idsToMark.includes(n.id)) n.read = false
if (localIdsToMark.includes(n.id)) n.read = false
})
await fetchUnreadCount()
toast.error('已读操作失败,请稍后重试')
return
}
fetchUnreadCount()
if (authState.role === 'ADMIN') {
toast.success('已读所有消息(注册请求除外)')
} else {
toast.success('已读所有消息')
}
}
return {
fetchNotifications,

mcp/AGENTS.md Normal file
View File

@@ -0,0 +1,51 @@
# MCP 服务协作指引
## 1) 适用范围
- 作用于 `mcp/` 目录及其子目录。
- 本模块对外提供 MCP tools,接口兼容性要求高。
## 2) 模块结构
- `src/openisle_mcp/server.py`:Tool 定义与请求处理入口。
- `src/openisle_mcp/search_client.py`:调用 OpenIsle 后端 HTTP API。
- `src/openisle_mcp/schemas.py`:Pydantic 数据契约。
- `src/openisle_mcp/config.py`:运行配置与环境变量读取。
## 3) 变更原则
- Tool 名称默认视为稳定契约,非必要不重命名。
- 后端接口字段变化时,优先同步 `schemas.py`,再调整 `server.py` 映射。
- 对认证接口保持“显式失败”:
- 缺 token、401、403 需给出可理解错误信息。
- 避免吞掉异常上下文,保留足够定位信息(HTTP 状态、接口语义)。
## 4) 与后端契约同步
- 高风险同步点:
- `create_post`
- `reply_to_post`
- `reply_to_comment`
- `list_unread_messages`
- `mark_notifications_read`
- 若后端响应结构改动,需同步:
- `search_client.py` 的解析逻辑
- `schemas.py` 的校验模型
- `README.md` 的 tool 说明(如有新增/删减)
## 5) 配置规则
- 环境变量统一使用 `OPENISLE_MCP_*` 前缀。
- 保持默认值可本地运行(如 `http://localhost:8080` 场景)。
- 不在代码中硬编码私密 token。
## 6) 验证建议
- 安装校验:`python -m pip install -e .`
- 启动校验:`openisle-mcp`(或项目内等价启动方式)
- 如改动 schema/解析逻辑,至少完成一次真实后端联调请求。
## 7) 输出要求
- 说明变更是否影响 tool 输入/输出契约。
- 说明是否要求调用方更新(参数名、字段、错误语义)。

View File

@@ -31,9 +31,12 @@ By default the server listens on port `8085` and serves MCP over Streamable HTTP
| Tool | Description |
| --- | --- |
| `search` | Perform a global search against the OpenIsle backend. |
| `create_post` | Publish a new post using a JWT token. |
| `reply_to_post` | Create a new comment on a post using a JWT token. |
| `reply_to_comment` | Reply to an existing comment using a JWT token. |
| `recent_posts` | Retrieve posts created within the last *N* minutes. |
The tools return structured data mirroring the backend DTOs, including highlighted snippets for
search results, the full comment payload for replies, and detailed metadata for recent posts.
search results, the full comment payload for post replies and comment replies, and detailed
metadata for recent posts.

View File

@@ -5,7 +5,7 @@ from __future__ import annotations
from functools import lru_cache
from typing import Literal
from pydantic import Field
from pydantic import Field, SecretStr
from pydantic.networks import AnyHttpUrl
from pydantic_settings import BaseSettings, SettingsConfigDict
@@ -36,6 +36,20 @@ class Settings(BaseSettings):
gt=0,
description="Timeout (seconds) for backend search requests.",
)
access_token: SecretStr | None = Field(
default=None,
description=(
"Optional JWT bearer token used for authenticated backend calls. "
"When set, tools that support authentication will use this token "
"automatically unless an explicit token override is provided."
),
)
log_level: str = Field(
"INFO",
description=(
"Logging level for the MCP server (e.g. DEBUG, INFO, WARNING)."
),
)
model_config = SettingsConfigDict(
env_prefix="OPENISLE_MCP_",

View File

@@ -5,7 +5,7 @@ from __future__ import annotations
from datetime import datetime
from typing import Any, Optional
from pydantic import BaseModel, Field, ConfigDict
from pydantic import BaseModel, Field, ConfigDict, field_validator
class SearchResultItem(BaseModel):
@@ -170,6 +170,15 @@ class CommentData(BaseModel):
model_config = ConfigDict(populate_by_name=True, extra="allow")
@field_validator("replies", "reactions", mode="before")
@classmethod
def _ensure_comment_lists(cls, value: Any) -> list[Any]:
"""Convert ``None`` payloads to empty lists for comment collections."""
if value is None:
return []
return value
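The `mode="before"` validator pattern above can be exercised in isolation. A minimal sketch with a stripped-down stand-in for `CommentData` (pydantic v2 assumed):

```python
from typing import Any
from pydantic import BaseModel, Field, field_validator

class Comment(BaseModel):
    # Stand-in model: only the list field that gets normalized.
    replies: list[Any] = Field(default_factory=list)

    @field_validator("replies", mode="before")
    @classmethod
    def _ensure_lists(cls, value: Any) -> list[Any]:
        # A JSON `null` from the backend becomes an empty list before validation.
        return [] if value is None else value

assert Comment(replies=None).replies == []
assert Comment(replies=[{"id": 1}]).replies == [{"id": 1}]
```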
class CommentReplyResult(BaseModel):
"""Structured response returned when replying to a comment."""
@@ -177,6 +186,18 @@ class CommentReplyResult(BaseModel):
comment: CommentData = Field(description="Reply comment returned by the backend.")
class CommentCreateResult(BaseModel):
"""Structured response returned when creating a comment on a post."""
comment: CommentData = Field(description="Comment returned by the backend.")
class PostCreateResult(BaseModel):
"""Structured response returned when creating a new post."""
post: PostDetail = Field(description="Detailed post payload returned by the backend.")
class PostSummary(BaseModel):
"""Summary information for a post."""
@@ -247,6 +268,15 @@ class PostSummary(BaseModel):
model_config = ConfigDict(populate_by_name=True, extra="allow")
@field_validator("tags", "reactions", "participants", mode="before")
@classmethod
def _ensure_post_lists(cls, value: Any) -> list[Any]:
"""Normalize ``None`` values returned by the backend to empty lists."""
if value is None:
return []
return value
class RecentPostsResponse(BaseModel):
"""Structured response for the recent posts tool."""
@@ -260,3 +290,89 @@ class RecentPostsResponse(BaseModel):
CommentData.model_rebuild()
class PostDetail(PostSummary):
"""Detailed information for a single post, including comments."""
comments: list[CommentData] = Field(
default_factory=list,
description="Comments that belong to the post.",
)
model_config = ConfigDict(populate_by_name=True, extra="allow")
@field_validator("comments", mode="before")
@classmethod
def _ensure_comments_list(cls, value: Any) -> list[Any]:
"""Treat ``None`` comments payloads as empty lists."""
if value is None:
return []
return value
class NotificationData(BaseModel):
"""Unread notification payload returned by the backend."""
id: Optional[int] = Field(default=None, description="Notification identifier.")
type: Optional[str] = Field(default=None, description="Type of the notification.")
post: Optional[PostSummary] = Field(
default=None, description="Post associated with the notification if applicable."
)
comment: Optional[CommentData] = Field(
default=None, description="Comment referenced by the notification when available."
)
parent_comment: Optional[CommentData] = Field(
default=None,
alias="parentComment",
description="Parent comment for nested replies, when present.",
)
from_user: Optional[AuthorInfo] = Field(
default=None,
alias="fromUser",
description="User who triggered the notification.",
)
reaction_type: Optional[str] = Field(
default=None,
alias="reactionType",
description="Reaction type for reaction-based notifications.",
)
content: Optional[str] = Field(
default=None, description="Additional content or message for the notification."
)
approved: Optional[bool] = Field(
default=None, description="Approval status for moderation notifications."
)
read: Optional[bool] = Field(default=None, description="Whether the notification is read.")
created_at: Optional[datetime] = Field(
default=None,
alias="createdAt",
description="Timestamp when the notification was created.",
)
model_config = ConfigDict(populate_by_name=True, extra="allow")
class UnreadNotificationsResponse(BaseModel):
"""Structured response for unread notification queries."""
page: int = Field(description="Requested page index for the unread notifications.")
size: int = Field(description="Requested page size for the unread notifications.")
total: int = Field(description="Number of unread notifications returned in this page.")
notifications: list[NotificationData] = Field(
default_factory=list,
description="Unread notifications returned by the backend.",
)
class NotificationCleanupResult(BaseModel):
"""Structured response returned after marking notifications as read."""
processed_ids: list[int] = Field(
default_factory=list,
description="Identifiers that were marked as read in the backend.",
)
total_marked: int = Field(
description="Total number of notifications successfully marked as read.",
)

View File

@@ -3,38 +3,109 @@
from __future__ import annotations
import json
import logging
from typing import Any
import httpx
logger = logging.getLogger(__name__)
class SearchClient:
"""Client for calling the OpenIsle HTTP APIs used by the MCP server."""
def __init__(
self,
base_url: str,
*,
timeout: float = 10.0,
access_token: str | None = None,
) -> None:
self._base_url = base_url.rstrip("/")
self._timeout = timeout
self._client: httpx.AsyncClient | None = None
self._access_token = self._sanitize_token(access_token)
def _get_client(self) -> httpx.AsyncClient:
if self._client is None:
logger.debug(
"Creating httpx.AsyncClient for base URL %s with timeout %.2fs",
self._base_url,
self._timeout,
)
self._client = httpx.AsyncClient(
base_url=self._base_url,
timeout=self._timeout,
)
return self._client
@staticmethod
def _sanitize_token(token: str | None) -> str | None:
if token is None:
return None
stripped = token.strip()
return stripped or None
def update_access_token(self, token: str | None) -> None:
"""Update the default access token used for authenticated requests."""
self._access_token = self._sanitize_token(token)
if self._access_token:
logger.debug("Configured default access token for SearchClient requests.")
else:
logger.debug("Cleared default access token for SearchClient requests.")
def _resolve_token(self, token: str | None) -> str | None:
candidate = self._sanitize_token(token)
if candidate is not None:
return candidate
return self._access_token
def _require_token(self, token: str | None) -> str:
resolved = self._resolve_token(token)
if resolved is None:
raise ValueError(
"Authenticated request requires an access token. Provide a Bearer token "
"via the MCP Authorization header or configure a default token for the server."
)
return resolved
def _build_headers(
self,
*,
token: str | None = None,
accept: str = "application/json",
include_json: bool = False,
) -> dict[str, str]:
headers: dict[str, str] = {"Accept": accept}
resolved = self._resolve_token(token)
if resolved:
headers["Authorization"] = f"Bearer {resolved}"
if include_json:
headers["Content-Type"] = "application/json"
return headers
async def global_search(self, keyword: str) -> list[dict[str, Any]]:
"""Call the global search endpoint and return the parsed JSON payload."""
client = self._get_client()
logger.debug("Calling global search with keyword=%s", keyword)
response = await client.get(
"/api/search/global",
params={"keyword": keyword},
headers=self._build_headers(),
)
response.raise_for_status()
payload = response.json()
if not isinstance(payload, list):
formatted = json.dumps(payload, ensure_ascii=False)[:200]
raise ValueError(f"Unexpected response format from search endpoint: {formatted}")
logger.info(
"Global search returned %d results for keyword '%s'",
len(payload),
keyword,
)
return [self._ensure_dict(entry) for entry in payload]
async def reply_to_comment(
@@ -47,33 +118,101 @@ class SearchClient:
"""Reply to an existing comment and return the created reply."""
client = self._get_client()
resolved_token = self._require_token(token)
headers = self._build_headers(token=resolved_token, include_json=True)
payload: dict[str, Any] = {"content": content}
if captcha is not None:
stripped_captcha = captcha.strip()
if stripped_captcha:
payload["captcha"] = stripped_captcha
logger.debug(
"Posting reply to comment_id=%s (captcha=%s)",
comment_id,
bool(captcha),
)
response = await client.post(
f"/api/comments/{comment_id}/replies",
json=payload,
headers=headers,
)
response.raise_for_status()
body = self._ensure_dict(response.json())
logger.info("Reply to comment_id=%s succeeded with id=%s", comment_id, body.get("id"))
return body
async def reply_to_post(
self,
post_id: int,
token: str | None,
content: str,
captcha: str | None = None,
) -> dict[str, Any]:
"""Create a comment on a post and return the backend payload."""
client = self._get_client()
resolved_token = self._require_token(token)
headers = self._build_headers(token=resolved_token, include_json=True)
payload: dict[str, Any] = {"content": content}
if captcha is not None:
stripped_captcha = captcha.strip()
if stripped_captcha:
payload["captcha"] = stripped_captcha
logger.debug(
"Posting comment to post_id=%s (captcha=%s)",
post_id,
bool(captcha),
)
response = await client.post(
f"/api/posts/{post_id}/comments",
json=payload,
headers=headers,
)
response.raise_for_status()
body = self._ensure_dict(response.json())
logger.info("Reply to post_id=%s succeeded with id=%s", post_id, body.get("id"))
return body
async def create_post(
self,
payload: dict[str, Any],
*,
token: str | None,
) -> dict[str, Any]:
"""Create a new post and return the detailed backend payload."""
client = self._get_client()
resolved_token = self._require_token(token)
headers = self._build_headers(token=resolved_token, include_json=True)
logger.debug(
"Creating post with category_id=%s and %d tag(s)",
payload.get("categoryId"),
len(payload.get("tagIds", []) if isinstance(payload.get("tagIds"), list) else []),
)
response = await client.post(
"/api/posts",
json=payload,
headers=headers,
)
response.raise_for_status()
body = self._ensure_dict(response.json())
logger.info("Post creation succeeded with id=%s", body.get("id"))
return body
async def recent_posts(self, minutes: int) -> list[dict[str, Any]]:
"""Return posts created within the given timeframe."""
client = self._get_client()
logger.debug(
"Fetching recent posts within last %s minutes",
minutes,
)
response = await client.get(
"/api/posts/recent",
params={"minutes": minutes},
headers=self._build_headers(),
)
response.raise_for_status()
payload = response.json()
@@ -82,14 +221,120 @@ class SearchClient:
raise ValueError(
f"Unexpected response format from recent posts endpoint: {formatted}"
)
logger.info(
"Fetched %d recent posts for window=%s minutes",
len(payload),
minutes,
)
return [self._ensure_dict(entry) for entry in payload]
async def get_post(self, post_id: int, token: str | None = None) -> dict[str, Any]:
"""Retrieve the detailed payload for a single post."""
client = self._get_client()
headers = self._build_headers(token=token)
logger.debug("Fetching post details for post_id=%s", post_id)
response = await client.get(f"/api/posts/{post_id}", headers=headers)
response.raise_for_status()
body = self._ensure_dict(response.json())
logger.info(
"Retrieved post_id=%s successfully with %d top-level comments",
post_id,
len(body.get("comments", []) if isinstance(body.get("comments"), list) else []),
)
return body
async def list_unread_notifications(
self,
*,
page: int = 0,
size: int = 30,
token: str | None,
) -> list[dict[str, Any]]:
"""Return unread notifications for the authenticated user."""
client = self._get_client()
resolved_token = self._require_token(token)
logger.debug(
"Fetching unread notifications with page=%s, size=%s",
page,
size,
)
response = await client.get(
"/api/notifications/unread",
params={"page": page, "size": size},
headers=self._build_headers(token=resolved_token),
)
response.raise_for_status()
payload = response.json()
if not isinstance(payload, list):
formatted = json.dumps(payload, ensure_ascii=False)[:200]
raise ValueError(
"Unexpected response format from unread notifications endpoint: "
f"{formatted}"
)
logger.info(
"Fetched %d unread notifications (page=%s, size=%s)",
len(payload),
page,
size,
)
return [self._ensure_dict(entry) for entry in payload]
async def mark_notifications_read(
self,
ids: list[int],
*,
token: str | None,
) -> None:
"""Mark the provided notifications as read for the authenticated user."""
if not ids:
raise ValueError(
"At least one notification identifier must be provided to mark as read."
)
sanitized_ids: list[int] = []
for value in ids:
if isinstance(value, bool):
raise ValueError("Notification identifiers must be integers, not booleans.")
try:
converted = int(value)
except (TypeError, ValueError) as exc: # pragma: no cover - defensive
raise ValueError(
"Notification identifiers must be integers."
) from exc
if converted <= 0:
raise ValueError(
"Notification identifiers must be positive integers."
)
sanitized_ids.append(converted)
client = self._get_client()
resolved_token = self._require_token(token)
logger.debug(
"Marking %d notifications as read: ids=%s",
len(sanitized_ids),
sanitized_ids,
)
response = await client.post(
"/api/notifications/read",
json={"ids": sanitized_ids},
headers=self._build_headers(token=resolved_token, include_json=True),
)
response.raise_for_status()
logger.info(
"Successfully marked %d notifications as read.",
len(sanitized_ids),
)
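The identifier check above is strict because `bool` is a subclass of `int` in Python, so `True` would otherwise slip through `int()` silently as `1`. The same guard as a standalone sketch:

```python
def sanitize_ids(ids: list[object]) -> list[int]:
    # Reject empty input, booleans, non-integers, and non-positive values.
    if not ids:
        raise ValueError("at least one identifier is required")
    out: list[int] = []
    for value in ids:
        if isinstance(value, bool):
            raise ValueError("identifiers must be integers, not booleans")
        converted = int(value)  # raises TypeError/ValueError for non-numeric input
        if converted <= 0:
            raise ValueError("identifiers must be positive")
        out.append(converted)
    return out

assert sanitize_ids([1, "2", 3]) == [1, 2, 3]
```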
async def aclose(self) -> None:
"""Dispose of the underlying HTTP client."""
if self._client is not None:
await self._client.aclose()
self._client = None
logger.debug("Closed httpx.AsyncClient for SearchClient.")
@staticmethod
def _ensure_dict(entry: Any) -> dict[str, Any]:

View File

@@ -2,6 +2,7 @@
from __future__ import annotations
import logging
from contextlib import asynccontextmanager
from typing import Annotated
@@ -12,8 +13,14 @@ from pydantic import Field as PydanticField
from .config import get_settings
from .schemas import (
CommentCreateResult,
CommentData,
CommentReplyResult,
NotificationData,
NotificationCleanupResult,
UnreadNotificationsResponse,
PostDetail,
PostCreateResult,
PostSummary,
RecentPostsResponse,
SearchResponse,
@@ -22,26 +29,79 @@ from .schemas import (
from .search_client import SearchClient
settings = get_settings()
if not logging.getLogger().handlers:
logging.basicConfig(
level=getattr(logging, settings.log_level.upper(), logging.INFO),
format="%(asctime)s | %(levelname)s | %(name)s | %(message)s",
)
else:
logging.getLogger().setLevel(
getattr(logging, settings.log_level.upper(), logging.INFO)
)
logger = logging.getLogger(__name__)
search_client = SearchClient(
str(settings.backend_base_url),
timeout=settings.request_timeout,
access_token=(
settings.access_token.get_secret_value()
if settings.access_token is not None
else None
),
)
def _extract_authorization_token(ctx: Context | None) -> str | None:
"""Return the Bearer token from the incoming MCP request headers."""
if ctx is None:
return None
try:
request_context = ctx.request_context
except ValueError:
return None
request = getattr(request_context, "request", None)
if request is None:
return None
headers = getattr(request, "headers", None)
if headers is None:
return None
authorization = headers.get("authorization")
if not authorization:
return None
scheme, _, token = authorization.partition(" ")
if scheme.lower() != "bearer":
return None
stripped = token.strip()
return stripped or None
@asynccontextmanager
async def lifespan(_: FastMCP):
"""Lifecycle hook that disposes shared resources when the server stops."""
try:
logger.debug("OpenIsle MCP server lifespan started.")
yield
finally:
logger.debug("Disposing shared SearchClient instance.")
await search_client.aclose()
app = FastMCP(
name="openisle-mcp",
instructions=(
"Use this server to search OpenIsle content, create new posts, reply to posts and "
"comments using the Authorization header or configured access token, retrieve details "
"for a specific post, list posts created within a recent time window, and review "
"unread notification messages."
),
host=settings.host,
port=settings.port,
@@ -65,6 +125,7 @@ async def search(
raise ValueError("Keyword must not be empty.")
try:
logger.info("Received search request for keyword='%s'", sanitized)
raw_results = await search_client.global_search(sanitized)
except httpx.HTTPStatusError as exc: # pragma: no cover - network errors
message = (
@@ -90,13 +151,122 @@ async def search(
if ctx is not None:
await ctx.info(f"Search keyword '{sanitized}' returned {len(results)} results.")
logger.debug(
"Validated %d search results for keyword='%s'",
len(results),
sanitized,
)
return SearchResponse(keyword=sanitized, total=len(results), results=results)
@app.tool(
name="reply_to_post",
description=(
"Create a comment on a post using the request Authorization header or the configured "
"access token."
),
structured_output=True,
)
async def reply_to_post(
post_id: Annotated[
int,
PydanticField(ge=1, description="Identifier of the post being replied to."),
],
content: Annotated[
str,
PydanticField(description="Markdown content of the reply."),
],
captcha: Annotated[
str | None,
PydanticField(
default=None,
description="Optional captcha solution if the backend requires it.",
),
] = None,
ctx: Context | None = None,
) -> CommentCreateResult:
"""Create a comment on a post and return the backend payload."""
sanitized_content = content.strip()
if not sanitized_content:
raise ValueError("Reply content must not be empty.")
sanitized_captcha = captcha.strip() if isinstance(captcha, str) else None
request_token = _extract_authorization_token(ctx)
try:
logger.info(
"Creating reply for post_id=%s (captcha=%s)",
post_id,
bool(sanitized_captcha),
)
raw_comment = await search_client.reply_to_post(
post_id,
token=request_token,
content=sanitized_content,
captcha=sanitized_captcha,
)
except httpx.HTTPStatusError as exc: # pragma: no cover - network errors
status_code = exc.response.status_code
if status_code == 401:
message = (
"Authentication failed while replying to post "
f"{post_id}. Please verify the Authorization header or configured token."
)
elif status_code == 403:
message = (
"The provided Authorization token is not authorized to reply to post "
f"{post_id}."
)
elif status_code == 404:
message = f"Post {post_id} was not found."
else:
message = (
"OpenIsle backend returned HTTP "
f"{status_code} while replying to post {post_id}."
)
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
except httpx.RequestError as exc: # pragma: no cover - network errors
message = (
"Unable to reach OpenIsle backend comment service: "
f"{exc}."
)
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
try:
comment = CommentData.model_validate(raw_comment)
except ValidationError as exc:
message = "Received malformed data from the post comment endpoint."
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
if ctx is not None:
await ctx.info(
"Reply created successfully for post "
f"{post_id}."
)
logger.debug(
"Validated reply comment payload for post_id=%s (comment_id=%s)",
post_id,
comment.id,
)
return CommentCreateResult(comment=comment)
@app.tool(
name="reply_to_comment",
description=(
"Reply to an existing comment using the request Authorization header or the configured "
"access token."
),
structured_output=True,
)
async def reply_to_comment(
@@ -104,7 +274,6 @@ async def reply_to_comment(
int,
PydanticField(ge=1, description="Identifier of the comment being replied to."),
],
content: Annotated[
str,
PydanticField(description="Markdown content of the reply."),
@@ -124,29 +293,32 @@ async def reply_to_comment(
if not sanitized_content:
raise ValueError("Reply content must not be empty.")
sanitized_captcha = captcha.strip() if isinstance(captcha, str) else None
request_token = _extract_authorization_token(ctx)
try:
logger.info(
"Creating reply for comment_id=%s (captcha=%s)",
comment_id,
bool(sanitized_captcha),
)
raw_comment = await search_client.reply_to_comment(
comment_id,
token=request_token,
content=sanitized_content,
captcha=sanitized_captcha,
)
except httpx.HTTPStatusError as exc: # pragma: no cover - network errors
status_code = exc.response.status_code
if status_code == 401:
message = (
"Authentication failed while replying to comment "
f"{comment_id}. Please verify the Authorization header or configured token."
)
elif status_code == 403:
message = (
"The provided Authorization token is not authorized to reply to comment "
f"{comment_id}."
)
else:
@@ -179,10 +351,351 @@ async def reply_to_comment(
"Reply created successfully for comment "
f"{comment_id}."
)
logger.debug(
"Validated reply payload for comment_id=%s (reply_id=%s)",
comment_id,
comment.id,
)
return CommentReplyResult(comment=comment)
@app.tool(
name="create_post",
description=(
"Publish a new post using the request Authorization header or the configured access "
"token."
),
structured_output=True,
)
async def create_post(
title: Annotated[
str,
PydanticField(description="Title of the post to be created."),
],
content: Annotated[
str,
PydanticField(description="Markdown content of the post."),
],
category_id: Annotated[
int | None,
PydanticField(
default=None,
ge=1,
description="Optional category identifier for the post.",
),
] = None,
tag_ids: Annotated[
list[int] | None,
PydanticField(
default=None,
min_length=1,
description="Optional list of tag identifiers to assign to the post.",
),
] = None,
post_type: Annotated[
str | None,
PydanticField(
default=None,
description="Optional post type value (e.g. LOTTERY, POLL).",
),
] = None,
visible_scope: Annotated[
str | None,
PydanticField(
default=None,
description="Optional visibility scope for the post.",
),
] = None,
prize_description: Annotated[
str | None,
PydanticField(
default=None,
description="Description of the prize for lottery posts.",
),
] = None,
prize_icon: Annotated[
str | None,
PydanticField(
default=None,
description="Icon URL for the lottery prize.",
),
] = None,
prize_count: Annotated[
int | None,
PydanticField(
default=None,
ge=1,
description="Total number of prizes available for lottery posts.",
),
] = None,
point_cost: Annotated[
int | None,
PydanticField(
default=None,
ge=0,
description="Point cost required to participate in the post, when applicable.",
),
] = None,
start_time: Annotated[
str | None,
PydanticField(
default=None,
description="ISO 8601 start time for lottery or poll posts.",
),
] = None,
end_time: Annotated[
str | None,
PydanticField(
default=None,
description="ISO 8601 end time for lottery or poll posts.",
),
] = None,
options: Annotated[
list[str] | None,
PydanticField(
default=None,
min_length=1,
description="Poll options when creating a poll post.",
),
] = None,
multiple: Annotated[
bool | None,
PydanticField(
default=None,
description="Whether the poll allows selecting multiple options.",
),
] = None,
proposed_name: Annotated[
str | None,
PydanticField(
default=None,
description="Proposed category name for suggestion posts.",
),
] = None,
proposal_description: Annotated[
str | None,
PydanticField(
default=None,
description="Supporting description for the proposed category.",
),
] = None,
captcha: Annotated[
str | None,
PydanticField(
default=None,
description="Captcha solution if the backend requires one to create posts.",
),
] = None,
ctx: Context | None = None,
) -> PostCreateResult:
"""Create a new post in OpenIsle and return the detailed backend payload."""
sanitized_title = title.strip()
if not sanitized_title:
raise ValueError("Post title must not be empty.")
sanitized_content = content.strip()
if not sanitized_content:
raise ValueError("Post content must not be empty.")
sanitized_category_id: int | None = None
if category_id is not None:
if isinstance(category_id, bool):
raise ValueError("Category identifier must be an integer, not a boolean.")
try:
sanitized_category_id = int(category_id)
except (TypeError, ValueError) as exc:
raise ValueError("Category identifier must be an integer.") from exc
if sanitized_category_id <= 0:
raise ValueError("Category identifier must be a positive integer.")
if sanitized_category_id is None:
raise ValueError("A category identifier is required to create a post.")
sanitized_tag_ids: list[int] | None = None
if tag_ids is not None:
sanitized_tag_ids = []
for value in tag_ids:
if isinstance(value, bool):
raise ValueError("Tag identifiers must be integers, not booleans.")
try:
converted = int(value)
except (TypeError, ValueError) as exc:
raise ValueError("Tag identifiers must be integers.") from exc
if converted <= 0:
raise ValueError("Tag identifiers must be positive integers.")
sanitized_tag_ids.append(converted)
if not sanitized_tag_ids:
sanitized_tag_ids = None
if not sanitized_tag_ids:
raise ValueError("At least one tag identifier is required to create a post.")
if len(sanitized_tag_ids) > 2:
raise ValueError("At most two tag identifiers can be provided for a post.")
sanitized_post_type = post_type.strip() if isinstance(post_type, str) else None
if sanitized_post_type == "":
sanitized_post_type = None
sanitized_visible_scope = (
visible_scope.strip() if isinstance(visible_scope, str) else None
)
if sanitized_visible_scope == "":
sanitized_visible_scope = None
sanitized_prize_description = (
prize_description.strip() if isinstance(prize_description, str) else None
)
if sanitized_prize_description == "":
sanitized_prize_description = None
sanitized_prize_icon = prize_icon.strip() if isinstance(prize_icon, str) else None
if sanitized_prize_icon == "":
sanitized_prize_icon = None
sanitized_prize_count: int | None = None
if prize_count is not None:
if isinstance(prize_count, bool):
raise ValueError("Prize count must be an integer, not a boolean.")
try:
sanitized_prize_count = int(prize_count)
except (TypeError, ValueError) as exc:
raise ValueError("Prize count must be an integer.") from exc
if sanitized_prize_count <= 0:
raise ValueError("Prize count must be a positive integer.")
sanitized_point_cost: int | None = None
if point_cost is not None:
if isinstance(point_cost, bool):
raise ValueError("Point cost must be an integer, not a boolean.")
try:
sanitized_point_cost = int(point_cost)
except (TypeError, ValueError) as exc:
raise ValueError("Point cost must be an integer.") from exc
if sanitized_point_cost < 0:
raise ValueError("Point cost cannot be negative.")
sanitized_start_time = start_time.strip() if isinstance(start_time, str) else None
if sanitized_start_time == "":
sanitized_start_time = None
sanitized_end_time = end_time.strip() if isinstance(end_time, str) else None
if sanitized_end_time == "":
sanitized_end_time = None
sanitized_options: list[str] | None = None
if options is not None:
sanitized_options = []
for option in options:
if option is None:
continue
stripped_option = option.strip()
if stripped_option:
sanitized_options.append(stripped_option)
if not sanitized_options:
sanitized_options = None
sanitized_multiple = bool(multiple) if isinstance(multiple, bool) else None
sanitized_proposed_name = (
proposed_name.strip() if isinstance(proposed_name, str) else None
)
if sanitized_proposed_name == "":
sanitized_proposed_name = None
sanitized_proposal_description = (
proposal_description.strip() if isinstance(proposal_description, str) else None
)
if sanitized_proposal_description == "":
sanitized_proposal_description = None
sanitized_captcha = captcha.strip() if isinstance(captcha, str) else None
if sanitized_captcha == "":
sanitized_captcha = None
payload: dict[str, object] = {
"title": sanitized_title,
"content": sanitized_content,
}
if sanitized_category_id is not None:
payload["categoryId"] = sanitized_category_id
if sanitized_tag_ids is not None:
payload["tagIds"] = sanitized_tag_ids
if sanitized_post_type is not None:
payload["type"] = sanitized_post_type
if sanitized_visible_scope is not None:
payload["postVisibleScopeType"] = sanitized_visible_scope
if sanitized_prize_description is not None:
payload["prizeDescription"] = sanitized_prize_description
if sanitized_prize_icon is not None:
payload["prizeIcon"] = sanitized_prize_icon
if sanitized_prize_count is not None:
payload["prizeCount"] = sanitized_prize_count
if sanitized_point_cost is not None:
payload["pointCost"] = sanitized_point_cost
if sanitized_start_time is not None:
payload["startTime"] = sanitized_start_time
if sanitized_end_time is not None:
payload["endTime"] = sanitized_end_time
if sanitized_options is not None:
payload["options"] = sanitized_options
if sanitized_multiple is not None:
payload["multiple"] = sanitized_multiple
if sanitized_proposed_name is not None:
payload["proposedName"] = sanitized_proposed_name
if sanitized_proposal_description is not None:
payload["proposalDescription"] = sanitized_proposal_description
if sanitized_captcha is not None:
payload["captcha"] = sanitized_captcha
try:
logger.info("Creating post with title='%s'", sanitized_title)
raw_post = await search_client.create_post(payload, token=_extract_authorization_token(ctx))
except httpx.HTTPStatusError as exc: # pragma: no cover - network errors
status_code = exc.response.status_code
if status_code == 400:
message = (
"Post creation failed due to invalid input or captcha verification errors."
)
elif status_code == 401:
message = (
"Authentication failed while creating the post. Please verify the "
"Authorization header or configured token."
)
elif status_code == 403:
message = "The provided Authorization token is not authorized to create posts."
else:
message = (
"OpenIsle backend returned HTTP "
f"{status_code} while creating the post."
)
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
except httpx.RequestError as exc: # pragma: no cover - network errors
message = f"Unable to reach OpenIsle backend post service: {exc}."
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
try:
post = PostDetail.model_validate(raw_post)
except ValidationError as exc:
message = "Received malformed data from the post creation endpoint."
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
if ctx is not None:
await ctx.info(f"Post '{post.title}' created successfully.")
logger.debug(
"Validated created post payload with id=%s and title='%s'",
post.id,
post.title,
)
return PostCreateResult(post=post)
@app.tool(
name="recent_posts",
description="Retrieve posts created in the last N minutes.",
@@ -198,6 +711,7 @@ async def recent_posts(
"""Fetch recent posts from the backend and return structured data."""
try:
logger.info("Fetching recent posts for last %s minutes", minutes)
raw_posts = await search_client.recent_posts(minutes)
except httpx.HTTPStatusError as exc: # pragma: no cover - network errors
message = (
@@ -225,10 +739,219 @@ async def recent_posts(
await ctx.info(
f"Found {len(posts)} posts created within the last {minutes} minutes."
)
logger.debug(
"Validated %d recent posts for window=%s minutes",
len(posts),
minutes,
)
return RecentPostsResponse(minutes=minutes, total=len(posts), posts=posts)
@app.tool(
name="get_post",
description="Retrieve detailed information for a single post.",
structured_output=True,
)
async def get_post(
post_id: Annotated[
int,
PydanticField(ge=1, description="Identifier of the post to retrieve."),
],
ctx: Context | None = None,
) -> PostDetail:
"""Fetch post details from the backend and validate the response."""
try:
logger.info("Fetching post details for post_id=%s", post_id)
raw_post = await search_client.get_post(
post_id, _extract_authorization_token(ctx)
)
except httpx.HTTPStatusError as exc: # pragma: no cover - network errors
status_code = exc.response.status_code
if status_code == 404:
message = f"Post {post_id} was not found."
elif status_code == 401:
message = "Authentication failed while retrieving the post."
elif status_code == 403:
message = "The provided Authorization token is not authorized to view this post."
else:
message = (
"OpenIsle backend returned HTTP "
f"{status_code} while retrieving post {post_id}."
)
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
except httpx.RequestError as exc: # pragma: no cover - network errors
message = f"Unable to reach OpenIsle backend post service: {exc}."
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
try:
post = PostDetail.model_validate(raw_post)
except ValidationError as exc:
message = "Received malformed data from the post detail endpoint."
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
if ctx is not None:
await ctx.info(f"Retrieved post {post_id} successfully.")
logger.debug(
"Validated post payload for post_id=%s with %d comments",
post_id,
len(post.comments),
)
return post
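The status-code branching above is a pure mapping, so it can be mirrored as a standalone helper that is easy to unit-test in isolation (a sketch; `post_error_message` is a hypothetical name, not part of the server):

```python
def post_error_message(status_code: int, post_id: int) -> str:
    """Map an HTTP status from the post endpoint to the user-facing message."""
    if status_code == 404:
        return f"Post {post_id} was not found."
    if status_code == 401:
        return "Authentication failed while retrieving the post."
    if status_code == 403:
        return "The provided Authorization token is not authorized to view this post."
    # Fall through to a generic message for any other error status.
    return (
        "OpenIsle backend returned HTTP "
        f"{status_code} while retrieving post {post_id}."
    )
```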
@app.tool(
name="list_unread_messages",
description="List unread notification messages for the authenticated user.",
structured_output=True,
)
async def list_unread_messages(
page: Annotated[
int,
PydanticField(
default=0,
ge=0,
description="Page number of unread notifications to retrieve.",
),
] = 0,
size: Annotated[
int,
PydanticField(
default=30,
ge=1,
le=100,
description="Number of unread notifications to include per page.",
),
] = 30,
ctx: Context | None = None,
) -> UnreadNotificationsResponse:
"""Retrieve unread notifications and return structured data."""
try:
logger.info(
"Fetching unread notifications (page=%s, size=%s)",
page,
size,
)
raw_notifications = await search_client.list_unread_notifications(
page=page,
size=size,
token=_extract_authorization_token(ctx),
)
except httpx.HTTPStatusError as exc: # pragma: no cover - network errors
message = (
"OpenIsle backend returned HTTP "
f"{exc.response.status_code} while fetching unread notifications."
)
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
except httpx.RequestError as exc: # pragma: no cover - network errors
message = f"Unable to reach OpenIsle backend notification service: {exc}."
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
try:
notifications = [
NotificationData.model_validate(entry) for entry in raw_notifications
]
except ValidationError as exc:
message = "Received malformed data from the unread notifications endpoint."
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
total = len(notifications)
if ctx is not None:
await ctx.info(
f"Retrieved {total} unread notifications (page {page}, size {size})."
)
logger.debug(
"Validated %d unread notifications for page=%s size=%s",
total,
page,
size,
)
return UnreadNotificationsResponse(
page=page,
size=size,
total=total,
notifications=notifications,
)
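For reference, the page/size semantics used here (0-based `page`, `size` between 1 and 100, and `total` counting the items on the returned page rather than across all pages) can be sketched against a plain list (`UnreadPage` and `paginate` are illustrative names, not part of the server):

```python
from dataclasses import dataclass


@dataclass
class UnreadPage:
    # Mirrors the fields of UnreadNotificationsResponse as used above.
    page: int
    size: int
    total: int
    notifications: list


def paginate(items: list, page: int = 0, size: int = 30) -> UnreadPage:
    """Slice `items` using the same 0-based page/size semantics as the tool."""
    if page < 0 or not 1 <= size <= 100:
        raise ValueError("page must be >= 0 and size must be between 1 and 100")
    window = items[page * size : (page + 1) * size]
    # total reflects the current page, matching total=len(notifications) above.
    return UnreadPage(page=page, size=size, total=len(window), notifications=window)
```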
@app.tool(
name="mark_notifications_read",
description="Mark specific notification messages as read to remove them from the unread list.",
structured_output=True,
)
async def mark_notifications_read(
ids: Annotated[
list[int],
PydanticField(
min_length=1,
description="Notification identifiers that should be marked as read.",
),
],
ctx: Context | None = None,
) -> NotificationCleanupResult:
"""Mark the supplied notifications as read and report the processed identifiers."""
try:
logger.info(
"Marking %d notifications as read", # pragma: no branch - logging
len(ids),
)
await search_client.mark_notifications_read(
ids, token=_extract_authorization_token(ctx)
)
except httpx.HTTPStatusError as exc: # pragma: no cover - network errors
message = (
"OpenIsle backend returned HTTP "
f"{exc.response.status_code} while marking notifications as read."
)
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
except httpx.RequestError as exc: # pragma: no cover - network errors
message = f"Unable to reach OpenIsle backend notification service: {exc}."
if ctx is not None:
await ctx.error(message)
raise ValueError(message) from exc
processed_ids: list[int] = []
for value in ids:
if isinstance(value, bool):
raise ValueError("Notification identifiers must be integers, not booleans.")
converted = int(value)
if converted <= 0:
raise ValueError("Notification identifiers must be positive integers.")
processed_ids.append(converted)
if ctx is not None:
await ctx.info(
f"Marked {len(processed_ids)} notifications as read.",
)
logger.debug(
"Successfully marked notifications as read: ids=%s",
processed_ids,
)
return NotificationCleanupResult(
processed_ids=processed_ids,
total_marked=len(processed_ids),
)
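The identifier checks in this tool can be expressed as a small pure helper; the explicit `bool` test matters because `bool` is a subclass of `int` in Python, so `True` would otherwise slip through as `1` (a sketch; `normalize_notification_ids` is a hypothetical name):

```python
def normalize_notification_ids(ids: list[int]) -> list[int]:
    """Validate and convert notification identifiers, rejecting bools and non-positives."""
    processed: list[int] = []
    for value in ids:
        # bool is a subclass of int, so True/False would otherwise pass as 1/0.
        if isinstance(value, bool):
            raise ValueError("Notification identifiers must be integers, not booleans.")
        converted = int(value)
        if converted <= 0:
            raise ValueError("Notification identifiers must be positive integers.")
        processed.append(converted)
    return processed
```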
def main() -> None:
"""Run the MCP server using the configured transport."""
@@ -0,0 +1,48 @@
# WebSocket Service Collaboration Guide

## 1) Scope
- Applies to the `websocket_service/` directory and its subdirectories.
- This service is a critical node in the real-time notification pipeline, so change it with care.

## 2) Responsibilities
- Maintains real-time client connections over STOMP (`/api/ws` and `/api/sockjs`).
- Consumes notifications from RabbitMQ queues and forwards them to user/session destinations.
- Performs JWT authentication during the connect phase.

## 3) Key Consistency Rules
- The JWT secret must match the backend's (same `JWT_SECRET` semantics).
- Queue configuration must stay in sync with the backend sharding strategy:
  - Backend declaration: `backend/.../RabbitMQConfig.java`
  - Backend sharding: `backend/.../ShardingStrategy.java`
  - Listener in this service: `src/main/java/com/openisle/websocket/listener/NotificationListener.java`
- Current listening-queue convention: 16 hexadecimal shard queues plus the legacy `notifications-queue`.
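The 16-hex-shard convention can be illustrated with a hypothetical sketch; the queue-name prefix `notifications-queue-` and the md5-based mapping below are assumptions for illustration only, and the authoritative rule lives in the backend `ShardingStrategy.java`:

```python
import hashlib


def shard_queue_name(user_id: int, prefix: str = "notifications-queue-") -> str:
    """Hypothetically map a user id onto one of 16 hex-named shard queues.

    Illustrates the "16 hexadecimal shards" convention only; the real
    mapping is defined by the backend's ShardingStrategy.
    """
    digest = hashlib.md5(str(user_id).encode()).hexdigest()
    # One hexadecimal character (0-f) selects one of the 16 shard queues.
    return prefix + digest[0]
```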

## 4) Modification Rules
- Do not change STOMP destination names (`/topic/...` and `/user/...`) casually.
- If destinations must change, update the frontend `frontend_nuxt/composables/useWebSocket.js` and the related consumer code in the same change.
- The CONNECT authentication-failure policy in `WebSocketAuthInterceptor` (reject the connection) must stay clear and consistent.
- Changes to allowed origins must account for local, staging, and production domains.

## 5) Configuration and Observability
- Configuration entry point: `src/main/resources/application.properties`
- Health check: `/actuator/health` (deployment and compose rely on this path).
- Log-level changes must avoid generating high noise in production.

## 6) Verification Recommendations
- Preferred: `mvn test`
- If there is no test coverage yet: `mvn -DskipTests compile`
- When changing message-push logic, complete at least one end-to-end verification:
  - the producer sends a message
  - RabbitMQ consumes it successfully
  - the client receives the message on the corresponding destination

## 7) Output Requirements
- State whether the change affects queue names, routing keys, destinations, or authentication logic.
- State whether matching frontend/backend changes are required.