Mirror of https://github.com/nagisa77/OpenIsle.git (synced 2026-02-20 22:11:01 +08:00)

Compare commits: 1 commit on branch codex/crea...

Commit: 28efd376b6
.gitignore (vendored, 1 change)

@@ -17,6 +17,7 @@ dist
 # misc
 .DS_Store
+__pycache__/
 *.pem
 npm-debug.log*
 yarn-debug.log*
docker/docker-compose.yaml

@@ -213,28 +213,30 @@ services:
       - dev_local_backend
       - prod

-  mcp-server:
+  mcp-service:
     build:
       context: ..
-      dockerfile: docker/mcp-service.Dockerfile
+      dockerfile: mcp/Dockerfile
     container_name: ${COMPOSE_PROJECT_NAME}-openisle-mcp
     env_file:
       - ${ENV_FILE:-../.env}
     environment:
-      OPENISLE_API_BASE_URL: ${OPENISLE_API_BASE_URL:-http://springboot:8080}
-      OPENISLE_MCP_HOST: ${OPENISLE_MCP_HOST:-0.0.0.0}
-      OPENISLE_MCP_PORT: ${OPENISLE_MCP_PORT:-8000}
-      OPENISLE_MCP_TRANSPORT: ${OPENISLE_MCP_TRANSPORT:-streamable-http}
+      FASTMCP_HOST: 0.0.0.0
+      FASTMCP_PORT: ${MCP_PORT:-8765}
+      OPENISLE_BACKEND_URL: ${OPENISLE_BACKEND_URL:-http://springboot:8080}
+      OPENISLE_BACKEND_TIMEOUT: ${OPENISLE_BACKEND_TIMEOUT:-10}
+      OPENISLE_MCP_TRANSPORT: ${OPENISLE_MCP_TRANSPORT:-sse}
+      OPENISLE_MCP_SSE_MOUNT_PATH: ${OPENISLE_MCP_SSE_MOUNT_PATH:-/mcp}
     ports:
-      - "${OPENISLE_MCP_PORT:-8000}:8000"
+      - "${MCP_PORT:-8765}:${MCP_PORT:-8765}"
     depends_on:
       springboot:
         condition: service_healthy
     restart: unless-stopped
     networks:
       - openisle-network
     profiles:
       - dev
       - dev_local_backend
       - prod

   frontend_dev:
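For reference, the `${VAR:-default}` substitutions above fall back the same way `os.getenv` defaults do inside the container. A minimal sketch of what the renamed `mcp-service` container sees when nothing is defined in `.env` (the variable names come from the compose block above; the script itself is illustrative only):

```python
import os

# Defaults mirroring the mcp-service environment block above; with an empty .env
# the container ends up with exactly these values.
fastmcp_port = os.getenv("MCP_PORT", "8765")
backend_url = os.getenv("OPENISLE_BACKEND_URL", "http://springboot:8080")
transport = os.getenv("OPENISLE_MCP_TRANSPORT", "sse")
sse_mount_path = os.getenv("OPENISLE_MCP_SSE_MOUNT_PATH", "/mcp")

print(f"MCP listens on 0.0.0.0:{fastmcp_port} via {transport} (mount path {sse_mount_path})")
print(f"Backend calls go to {backend_url}")
```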
docker/mcp-service.Dockerfile (deleted)

@@ -1,20 +0,0 @@
-FROM python:3.11-slim AS base
-
-ENV PYTHONUNBUFFERED=1 \
-    PIP_NO_CACHE_DIR=1
-
-WORKDIR /app
-
-COPY mcp/pyproject.toml mcp/README.md ./
-COPY mcp/src ./src
-RUN pip install --upgrade pip \
-    && pip install .
-
-EXPOSE 8000
-
-ENV OPENISLE_API_BASE_URL=http://springboot:8080 \
-    OPENISLE_MCP_HOST=0.0.0.0 \
-    OPENISLE_MCP_PORT=8000 \
-    OPENISLE_MCP_TRANSPORT=streamable-http
-
-CMD ["openisle-mcp"]
mcp/.gitignore (vendored, 6 deletions)

@@ -1,6 +0,0 @@
-__pycache__/
-*.py[cod]
-*.egg-info/
-.build/
-.venv/
-.env
mcp/Dockerfile (new file, 17 lines)

@@ -0,0 +1,17 @@
+FROM python:3.11-slim AS runtime
+
+ENV PYTHONUNBUFFERED=1 \
+    PIP_NO_CACHE_DIR=1
+
+WORKDIR /app
+
+COPY mcp/pyproject.toml /app/pyproject.toml
+COPY mcp/README.md /app/README.md
+COPY mcp/src /app/src
+
+RUN pip install --upgrade pip \
+    && pip install .
+
+EXPOSE 8765
+
+CMD ["openisle-mcp"]
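If you want to exercise this image outside Compose, a sketch along these lines should work. The tag `openisle-mcp:local` and the `host.docker.internal` backend address are assumptions for a local Docker Desktop setup; the build context must be the repository root so the `COPY mcp/...` instructions resolve:

```python
"""Build and run the MCP image without Compose (illustrative sketch, not part of the commit)."""

import subprocess

# Build from the repository root so the COPY mcp/... instructions in mcp/Dockerfile resolve.
subprocess.run(
    ["docker", "build", "-f", "mcp/Dockerfile", "-t", "openisle-mcp:local", "."],
    check=True,
)

# Publish the exposed port and mirror the environment the compose service sets.
subprocess.run(
    [
        "docker", "run", "--rm",
        "-p", "8765:8765",
        "-e", "FASTMCP_HOST=0.0.0.0",
        "-e", "FASTMCP_PORT=8765",
        "-e", "OPENISLE_MCP_TRANSPORT=sse",
        "-e", "OPENISLE_BACKEND_URL=http://host.docker.internal:8080",
        "openisle-mcp:local",
    ],
    check=True,
)
```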
mcp/README.md

@@ -1,45 +1,39 @@
 # OpenIsle MCP Server

-This package exposes a [Model Context Protocol](https://github.com/modelcontextprotocol) (MCP) server for OpenIsle.
-The initial release focuses on surfacing the platform's search capabilities so that AI assistants can discover
-users and posts directly through the existing REST API. Future iterations can expand this service with post
-creation and other productivity tools.
+This package provides a [Model Context Protocol](https://github.com/modelcontextprotocol) (MCP) server that exposes the OpenIsle
+search capabilities to AI assistants. The server wraps the existing Spring Boot backend and currently provides a single `search`
+tool. Future iterations can extend the server with additional functionality such as publishing new posts or moderating content.

 ## Features

-- 🔍 Keyword search across users and posts using the OpenIsle backend APIs
-- ✅ Structured MCP tool response for downstream reasoning
-- 🩺 Lightweight health check endpoint (`/health`) for container orchestration
-- ⚙️ Configurable via environment variables with sensible defaults for Docker Compose
+- 🔍 **Global search** — delegates to the existing `/api/search/global` endpoint exposed by the OpenIsle backend.
+- 🧠 **Structured results** — responses include highlights and deep links so AI clients can present the results cleanly.
+- ⚙️ **Configurable** — point the server at any reachable OpenIsle backend by setting environment variables.

-## Running locally
+## Local development

 ```bash
 cd mcp
-pip install .
-openisle-mcp  # starts the MCP server on http://127.0.0.1:8000 by default
+python -m venv .venv
+source .venv/bin/activate
+pip install -e .
+openisle-mcp --transport stdio  # or "sse"/"streamable-http"
 ```

-By default the server targets `http://localhost:8080` for backend requests. Override the target by setting
-`OPENISLE_API_BASE_URL` before starting the service.
-
-## Environment variables
+Environment variables:

-| Variable | Default | Description |
-| -------- | ------- | ----------- |
-| `OPENISLE_API_BASE_URL` | `http://localhost:8080` | Base URL of the OpenIsle backend API |
-| `OPENISLE_MCP_HOST` | `127.0.0.1` | Hostname/interface for the MCP HTTP server |
-| `OPENISLE_MCP_PORT` | `8000` | Port for the MCP HTTP server |
-| `OPENISLE_MCP_TRANSPORT` | `streamable-http` | Transport mode (`stdio`, `sse`, or `streamable-http`) |
-| `OPENISLE_MCP_TIMEOUT_SECONDS` | `10` | HTTP timeout when calling the backend |
+| Variable | Description | Default |
+| --- | --- | --- |
+| `OPENISLE_BACKEND_URL` | Base URL of the Spring Boot backend | `http://springboot:8080` |
+| `OPENISLE_BACKEND_TIMEOUT` | Timeout (seconds) for backend HTTP calls | `10` |
+| `OPENISLE_PUBLIC_BASE_URL` | Optional base URL used to build deep links in search results | *(unset)* |
+| `OPENISLE_MCP_TRANSPORT` | MCP transport (`stdio`, `sse`, `streamable-http`) | `stdio` |
+| `OPENISLE_MCP_SSE_MOUNT_PATH` | Mount path when using SSE transport | `/mcp` |
+| `FASTMCP_HOST` | Host for SSE / HTTP transports | `127.0.0.1` |
+| `FASTMCP_PORT` | Port for SSE / HTTP transports | `8000` |

 ## Docker

-The repository's Docker Compose stack now includes the MCP server. To start it alongside other services:
+A dedicated Docker image is provided and wired into `docker-compose.yaml`. The container listens on
+`${MCP_PORT:-8765}` and connects to the backend service running in the same compose stack.

 ```bash
 cd docker
 docker compose --profile dev up mcp-server
 ```
-
-The service exposes port `8000` by default. Update `OPENISLE_MCP_PORT` to customize the mapped port.
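To complement the README, here is a hedged sketch of calling the server's `search` tool from the official `mcp` Python client over stdio. The keyword, result limit, and backend URL are placeholders, and passing `OPENISLE_BACKEND_URL` through `env` assumes you want to override the `http://springboot:8080` default outside Compose:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the console script installed by `pip install -e .` and speak MCP over stdio.
    params = StdioServerParameters(
        command="openisle-mcp",
        args=["--transport", "stdio"],
        env={"OPENISLE_BACKEND_URL": "http://localhost:8080"},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool("search", {"keyword": "openisle", "limit": 5})
            print(result.content)


asyncio.run(main())
```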
mcp/pyproject.toml

@@ -5,22 +5,23 @@ build-backend = "setuptools.build_meta"
 [project]
 name = "openisle-mcp"
 version = "0.1.0"
-description = "Model Context Protocol server exposing OpenIsle search capabilities."
+description = "Model Context Protocol server exposing OpenIsle search capabilities"
 readme = "README.md"
-authors = [{name = "OpenIsle Team"}]
-license = {text = "MIT"}
 requires-python = ">=3.11"
+authors = [{ name = "OpenIsle" }]
 dependencies = [
     "mcp>=1.19.0",
-    "httpx>=0.28.1",
-    "pydantic>=2.7.0"
+    "httpx>=0.28.0",
+    "pydantic>=2.12.0",
 ]

+[project.urls]
+Homepage = "https://github.com/openisle/openisle"
+
 [project.scripts]
 openisle-mcp = "openisle_mcp.server:main"

 [tool.setuptools]
 package-dir = {"" = "src"}

 [tool.setuptools.packages.find]
 where = ["src"]
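The `[project.scripts]` table is what makes the `openisle-mcp` command available after installation. A small, hedged check of that wiring, assuming the package is already installed in the current environment:

```python
from importlib.metadata import entry_points

# The console script declared in [project.scripts] resolves to openisle_mcp.server:main.
(script,) = entry_points(group="console_scripts", name="openisle-mcp")
print(script.value)  # expected: "openisle_mcp.server:main"

main = script.load()  # the same callable the `openisle-mcp` command invokes
```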
mcp/src/openisle_mcp/__init__.py

@@ -1,14 +1,10 @@
 """OpenIsle MCP server package."""

-from .config import Settings, get_settings
-from .models import SearchItem, SearchResponse, SearchScope
+from importlib import metadata

-__all__ = [
-    "Settings",
-    "get_settings",
-    "SearchItem",
-    "SearchResponse",
-    "SearchScope",
-]
+try:
+    __version__ = metadata.version("openisle-mcp")
+except metadata.PackageNotFoundError:  # pragma: no cover - best effort during dev
+    __version__ = "0.0.0"

-__version__ = "0.1.0"
+__all__ = ["__version__"]
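A quick way to see the new version fallback in action (nothing beyond the package itself is assumed):

```python
# Resolves the version from the installed distribution metadata; in an uninstalled
# source checkout the PackageNotFoundError branch yields the "0.0.0" fallback.
import openisle_mcp

print(openisle_mcp.__version__)
```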
mcp/src/openisle_mcp/client.py

@@ -1,33 +1,79 @@
-"""HTTP client helpers for interacting with the OpenIsle backend APIs."""
+"""HTTP client for talking to the OpenIsle backend."""

 from __future__ import annotations

-from typing import Any
+import json
+import logging
+from typing import List

 import httpx
+from pydantic import ValidationError

-from .config import Settings, get_settings
-from .models import SearchScope
+from .models import BackendSearchResult
+
+__all__ = ["BackendClientError", "OpenIsleBackendClient"]
+
+logger = logging.getLogger(__name__)


-class OpenIsleAPI:
-    """Thin wrapper around the OpenIsle REST API used by the MCP server."""
+class BackendClientError(RuntimeError):
+    """Raised when the backend cannot fulfil a request."""

-    def __init__(self, settings: Settings | None = None) -> None:
-        self._settings = settings or get_settings()

-    async def search(self, scope: SearchScope, keyword: str) -> list[Any]:
-        """Execute a search request against the backend API."""
+class OpenIsleBackendClient:
+    """Tiny wrapper around the Spring Boot search endpoints."""

-        url_path = self._settings.get_search_path(scope)
-        async with httpx.AsyncClient(
-            base_url=str(self._settings.backend_base_url),
-            timeout=self._settings.request_timeout_seconds,
-        ) as client:
-            response = await client.get(url_path, params={"keyword": keyword})
-            response.raise_for_status()
-            data = response.json()
+    def __init__(self, base_url: str, timeout: float = 10.0) -> None:
+        if not base_url:
+            raise ValueError("base_url must not be empty")
+        self._base_url = base_url.rstrip("/")
+        timeout = timeout if timeout > 0 else 10.0
+        self._timeout = httpx.Timeout(timeout, connect=timeout, read=timeout)

-        if not isinstance(data, list):
-            raise RuntimeError("Unexpected search response payload: expected a list")
-        return data
+    @property
+    def base_url(self) -> str:
+        return self._base_url
+
+    async def search_global(self, keyword: str) -> List[BackendSearchResult]:
+        """Call `/api/search/global` and normalise the payload."""
+
+        url = f"{self._base_url}/api/search/global"
+        params = {"keyword": keyword}
+        headers = {"Accept": "application/json"}
+        logger.debug("Calling OpenIsle backend", extra={"url": url, "params": params})
+
+        try:
+            async with httpx.AsyncClient(timeout=self._timeout, headers=headers, follow_redirects=True) as client:
+                response = await client.get(url, params=params)
+                response.raise_for_status()
+        except httpx.HTTPStatusError as exc:  # pragma: no cover - network errors are rare in tests
+            body_preview = _truncate_body(exc.response.text)
+            raise BackendClientError(
+                f"Backend returned HTTP {exc.response.status_code}: {body_preview}"
+            ) from exc
+        except httpx.RequestError as exc:  # pragma: no cover - network errors are rare in tests
+            raise BackendClientError(f"Failed to reach backend: {exc}") from exc
+
+        try:
+            payload = response.json()
+        except json.JSONDecodeError as exc:
+            raise BackendClientError("Backend returned invalid JSON") from exc
+
+        if not isinstance(payload, list):
+            raise BackendClientError("Unexpected search payload type; expected a list")
+
+        results: list[BackendSearchResult] = []
+        for item in payload:
+            try:
+                results.append(BackendSearchResult.model_validate(item))
+            except ValidationError as exc:
+                raise BackendClientError(f"Invalid search result payload: {exc}") from exc
+
+        return results
+
+
+def _truncate_body(body: str, limit: int = 200) -> str:
+    body = body.strip()
+    if len(body) <= limit:
+        return body
+    return f"{body[:limit]}…"
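A hedged usage sketch for the new client; the base URL and keyword are placeholders for a locally running backend:

```python
import asyncio

from openisle_mcp.client import BackendClientError, OpenIsleBackendClient


async def main() -> None:
    # http://localhost:8080 assumes an OpenIsle backend running on this machine.
    client = OpenIsleBackendClient(base_url="http://localhost:8080", timeout=5.0)
    try:
        results = await client.search_global("openisle")
    except BackendClientError as exc:
        print(f"Search failed: {exc}")
        return
    for item in results:
        print(item.type, item.id, item.text)


asyncio.run(main())
```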
mcp/src/openisle_mcp/config.py (deleted)

@@ -1,83 +0,0 @@
-"""Configuration helpers for the OpenIsle MCP server."""
-
-from __future__ import annotations
-
-import os
-from functools import lru_cache
-from typing import Dict, Literal
-
-from pydantic import AnyHttpUrl, BaseModel, Field, ValidationError
-
-from .models import SearchScope
-
-TransportType = Literal["stdio", "sse", "streamable-http"]
-
-
-class Settings(BaseModel):
-    """Runtime configuration for the MCP server."""
-
-    backend_base_url: AnyHttpUrl = Field(
-        default="http://localhost:8080",
-        description="Base URL of the OpenIsle backend API.",
-    )
-    request_timeout_seconds: float = Field(
-        default=10.0,
-        gt=0,
-        description="HTTP timeout when talking to the backend APIs.",
-    )
-    transport: TransportType = Field(
-        default="streamable-http",
-        description="Transport mode for the MCP server.",
-    )
-    host: str = Field(default="127.0.0.1", description="Hostname/interface used by the MCP HTTP server.")
-    port: int = Field(default=8000, ge=0, description="Port used by the MCP HTTP server.")
-    search_paths: Dict[str, str] = Field(
-        default_factory=lambda: {
-            SearchScope.GLOBAL.value: "/api/search/global",
-            SearchScope.USERS.value: "/api/search/users",
-            SearchScope.POSTS.value: "/api/search/posts",
-            SearchScope.POSTS_TITLE.value: "/api/search/posts/title",
-            SearchScope.POSTS_CONTENT.value: "/api/search/posts/content",
-        },
-        description="Mapping between search scopes and backend API paths.",
-    )
-
-    def get_search_path(self, scope: SearchScope) -> str:
-        """Return the backend path associated with a given search scope."""
-
-        try:
-            return self.search_paths[scope.value]
-        except KeyError as exc:  # pragma: no cover - defensive guard
-            raise ValueError(f"Unsupported search scope: {scope}") from exc
-
-
-@lru_cache(maxsize=1)
-def get_settings() -> Settings:
-    """Load settings from environment variables with caching."""
-
-    raw_settings: Dict[str, object] = {}
-
-    backend_url = os.getenv("OPENISLE_API_BASE_URL")
-    if backend_url:
-        raw_settings["backend_base_url"] = backend_url
-
-    timeout = os.getenv("OPENISLE_MCP_TIMEOUT_SECONDS")
-    if timeout:
-        raw_settings["request_timeout_seconds"] = float(timeout)
-
-    transport = os.getenv("OPENISLE_MCP_TRANSPORT")
-    if transport:
-        raw_settings["transport"] = transport
-
-    host = os.getenv("OPENISLE_MCP_HOST")
-    if host:
-        raw_settings["host"] = host
-
-    port = os.getenv("OPENISLE_MCP_PORT")
-    if port:
-        raw_settings["port"] = int(port)
-
-    try:
-        return Settings(**raw_settings)
-    except (ValidationError, ValueError) as exc:  # pragma: no cover - configuration errors should surface clearly
-        raise RuntimeError(f"Invalid MCP configuration: {exc}") from exc
mcp/src/openisle_mcp/models.py

@@ -1,45 +1,58 @@
-"""Data models for the OpenIsle MCP server."""
+"""Pydantic models used by the OpenIsle MCP server."""

 from __future__ import annotations

-from enum import Enum
-from typing import Any, Dict, Optional
+from typing import Dict, Optional

-from pydantic import BaseModel, Field
+from pydantic import BaseModel, ConfigDict, Field
+
+__all__ = [
+    "BackendSearchResult",
+    "SearchResult",
+    "SearchResponse",
+]


-class SearchScope(str, Enum):
-    """Supported search scopes exposed via the MCP tool."""
+class BackendSearchResult(BaseModel):
+    """Shape of the payload returned by the OpenIsle backend."""

-    GLOBAL = "global"
-    USERS = "users"
-    POSTS = "posts"
-    POSTS_TITLE = "posts_title"
-    POSTS_CONTENT = "posts_content"
+    type: str
+    id: Optional[int] = None
+    text: Optional[str] = None
+    sub_text: Optional[str] = Field(default=None, alias="subText")
+    extra: Optional[str] = None
+    post_id: Optional[int] = Field(default=None, alias="postId")
+    highlighted_text: Optional[str] = Field(default=None, alias="highlightedText")
+    highlighted_sub_text: Optional[str] = Field(default=None, alias="highlightedSubText")
+    highlighted_extra: Optional[str] = Field(default=None, alias="highlightedExtra")
+
+    model_config = ConfigDict(populate_by_name=True, extra="ignore")


-class Highlight(BaseModel):
-    """Highlighted fragments returned by the backend search API."""
+class SearchResult(BaseModel):
+    """Structured search result returned to MCP clients."""

-    text: Optional[str] = Field(default=None, description="Highlighted main text snippet.")
-    sub_text: Optional[str] = Field(default=None, description="Highlighted secondary text snippet.")
-    extra: Optional[str] = Field(default=None, description="Additional highlighted data.")
+    type: str = Field(description="Entity type, e.g. post, comment, user")
+    id: Optional[int] = Field(default=None, description="Primary identifier for the entity")
+    title: Optional[str] = Field(default=None, description="Primary text to display")
+    subtitle: Optional[str] = Field(default=None, description="Secondary text (e.g. author or category)")
+    extra: Optional[str] = Field(default=None, description="Additional descriptive snippet")
+    post_id: Optional[int] = Field(default=None, description="Associated post id for comment results")
+    url: Optional[str] = Field(default=None, description="Deep link to the resource inside OpenIsle")
+    highlights: Dict[str, Optional[str]] = Field(
+        default_factory=dict,
+        description="Highlighted HTML fragments keyed by field name",
+    )

-
-class SearchItem(BaseModel):
-    """Normalized representation of a single search result."""
-
-    category: str = Field(description="Type/category of the search result, e.g. user or post.")
-    title: Optional[str] = Field(default=None, description="Primary title or label for the result.")
-    description: Optional[str] = Field(default=None, description="Supporting description or summary text.")
-    url: Optional[str] = Field(default=None, description="Canonical URL that references the resource, if available.")
-    metadata: Dict[str, Any] = Field(default_factory=dict, description="Additional structured metadata extracted from the API.")
-    highlights: Optional[Highlight] = Field(default=None, description="Highlighted snippets returned by the backend search API.")
+    model_config = ConfigDict(populate_by_name=True)


 class SearchResponse(BaseModel):
-    """Structured response returned by the MCP search tool."""
+    """Response envelope returned from the MCP search tool."""

-    scope: SearchScope = Field(description="Scope of the search that produced the results.")
-    keyword: str = Field(description="Keyword submitted to the backend search endpoint.")
-    results: list[SearchItem] = Field(default_factory=list, description="Normalized search results from the backend API.")
+    keyword: str = Field(description="Sanitised keyword that was searched for")
+    total_results: int = Field(description="Total number of results returned by the backend")
+    limit: int = Field(description="Maximum number of results included in the response")
+    results: list[SearchResult] = Field(default_factory=list, description="Search results up to the requested limit")
+
+    model_config = ConfigDict(populate_by_name=True)
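For orientation, a hedged example of how `BackendSearchResult` consumes the camelCase payload implied by the aliases above; the concrete values are invented:

```python
from openisle_mcp.models import BackendSearchResult

# Payload shape implied by the field aliases; every value here is made up.
payload = {
    "type": "post",
    "id": 42,
    "text": "Welcome to OpenIsle",
    "subText": "announcements",
    "highlightedText": "Welcome to <em>OpenIsle</em>",
}

result = BackendSearchResult.model_validate(payload)
print(result.sub_text)          # "announcements", populated via the subText alias
print(result.highlighted_text)  # highlighted HTML fragment
```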
mcp/src/openisle_mcp/search.py (deleted)

@@ -1,100 +0,0 @@
-"""Utilities for normalising OpenIsle search results."""
-
-from __future__ import annotations
-
-import re
-from typing import Any, Iterable
-
-from .models import Highlight, SearchItem, SearchScope
-
-
-def _truncate(text: str | None, *, limit: int = 240) -> str | None:
-    """Compress whitespace and truncate overly long text fragments."""
-
-    if not text:
-        return None
-    compact = re.sub(r"\s+", " ", text).strip()
-    if len(compact) <= limit:
-        return compact
-    return f"{compact[:limit - 1]}…"
-
-
-def _extract_highlight(data: dict[str, Any]) -> Highlight | None:
-    highlighted = {
-        "text": data.get("highlightedText"),
-        "sub_text": data.get("highlightedSubText"),
-        "extra": data.get("highlightedExtra"),
-    }
-    if any(highlighted.values()):
-        return Highlight(**highlighted)
-    return None
-
-
-def normalise_results(scope: SearchScope, payload: Iterable[dict[str, Any]]) -> list[SearchItem]:
-    """Convert backend payloads into :class:`SearchItem` entries."""
-
-    normalised: list[SearchItem] = []
-
-    for item in payload:
-        if not isinstance(item, dict):
-            continue
-
-        if scope is SearchScope.GLOBAL:
-            normalised.append(
-                SearchItem(
-                    category=item.get("type", scope.value),
-                    title=_truncate(item.get("text")),
-                    description=_truncate(item.get("subText")),
-                    metadata={
-                        "id": item.get("id"),
-                        "postId": item.get("postId"),
-                        "extra": item.get("extra"),
-                    },
-                    highlights=_extract_highlight(item),
-                )
-            )
-            continue
-
-        if scope in {SearchScope.POSTS, SearchScope.POSTS_CONTENT, SearchScope.POSTS_TITLE}:
-            author = item.get("author") or {}
-            category = item.get("category") or {}
-            metadata = {
-                "id": item.get("id"),
-                "author": author.get("username"),
-                "category": category.get("name"),
-                "views": item.get("views"),
-                "commentCount": item.get("commentCount"),
-                "tags": [tag.get("name") for tag in item.get("tags", []) if isinstance(tag, dict)],
-            }
-            normalised.append(
-                SearchItem(
-                    category="post",
-                    title=_truncate(item.get("title")),
-                    description=_truncate(item.get("content")),
-                    metadata={k: v for k, v in metadata.items() if v is not None},
-                )
-            )
-            continue
-
-        if scope is SearchScope.USERS:
-            metadata = {
-                "id": item.get("id"),
-                "email": item.get("email"),
-                "followers": item.get("followers"),
-                "following": item.get("following"),
-                "role": item.get("role"),
-            }
-            normalised.append(
-                SearchItem(
-                    category="user",
-                    title=_truncate(item.get("username")),
-                    description=_truncate(item.get("introduction")),
-                    metadata={k: v for k, v in metadata.items() if v is not None},
-                )
-            )
-            continue
-
-        # Fallback: include raw entry to aid debugging of unsupported scopes
-        normalised.append(SearchItem(category=scope.value, metadata=item))
-
-    return normalised
mcp/src/openisle_mcp/server.py

@@ -2,120 +2,163 @@
 from __future__ import annotations

+import argparse
 import logging
 import os
-from typing import Annotated
+from typing import Annotated, Optional

 from mcp.server.fastmcp import Context, FastMCP
-from mcp.server.fastmcp.logging import configure_logging
+from mcp.server.fastmcp import exceptions as mcp_exceptions
 from pydantic import Field
-from starlette.requests import Request
-from starlette.responses import JSONResponse, Response

-from .client import OpenIsleAPI
-from .config import Settings, get_settings
-from .models import SearchResponse, SearchScope
-from .search import normalise_results
+from .client import BackendClientError, OpenIsleBackendClient
+from .models import BackendSearchResult, SearchResponse, SearchResult

-_logger = logging.getLogger(__name__)
+logger = logging.getLogger(__name__)
+
+APP_NAME = "openisle-mcp"
+DEFAULT_BACKEND_URL = "http://springboot:8080"
+DEFAULT_TRANSPORT = "stdio"
+DEFAULT_TIMEOUT = 10.0
+DEFAULT_LIMIT = 20
+MAX_LIMIT = 50
+
+server = FastMCP(
+    APP_NAME,
+    instructions=(
+        "Use the `search` tool to query OpenIsle content. "
+        "Results include posts, comments, users, categories, and tags."
+    ),
+)


-def _create_server(settings: Settings) -> FastMCP:
-    """Instantiate the FastMCP server with configured metadata."""
-
-    server = FastMCP(
-        name="OpenIsle MCP",
-        instructions=(
-            "Access OpenIsle search functionality. Provide a keyword and optionally a scope to "
-            "discover users and posts from the community."
-        ),
-        host=settings.host,
-        port=settings.port,
-        transport_security=None,
-    )
-
-    @server.custom_route("/health", methods=["GET"])
-    async def health(_: Request) -> Response:  # pragma: no cover - exercised via runtime checks
-        return JSONResponse({"status": "ok"})
-
-    return server
+def _env(name: str, default: Optional[str] = None) -> Optional[str]:
+    value = os.getenv(name, default)
+    if value is None:
+        return None
+    trimmed = value.strip()
+    return trimmed or default


-async def _execute_search(
-    *,
-    api: OpenIsleAPI,
-    scope: SearchScope,
-    keyword: str,
-    context: Context | None,
-) -> SearchResponse:
-    message = f"Searching OpenIsle scope={scope.value} keyword={keyword!r}"
-    if context is not None:
-        context.info(message)
-    else:
-        _logger.info(message)
-
-    payload = await api.search(scope, keyword)
-    items = normalise_results(scope, payload)
-    return SearchResponse(scope=scope, keyword=keyword, results=items)
+def _load_timeout() -> float:
+    raw = _env("OPENISLE_BACKEND_TIMEOUT", str(DEFAULT_TIMEOUT))
+    try:
+        timeout = float(raw) if raw is not None else DEFAULT_TIMEOUT
+    except ValueError:
+        logger.warning("Invalid OPENISLE_BACKEND_TIMEOUT value '%s', falling back to %s", raw, DEFAULT_TIMEOUT)
+        return DEFAULT_TIMEOUT
+    if timeout <= 0:
+        logger.warning("Non-positive OPENISLE_BACKEND_TIMEOUT %s, falling back to %s", timeout, DEFAULT_TIMEOUT)
+        return DEFAULT_TIMEOUT
+    return timeout


-def build_server(settings: Settings | None = None) -> FastMCP:
-    """Configure and return the FastMCP server instance."""
-
-    resolved_settings = settings or get_settings()
-    server = _create_server(resolved_settings)
-    api_client = OpenIsleAPI(resolved_settings)
-
-    @server.tool(
-        name="openisle_search",
-        description="Search OpenIsle for users and posts.",
-    )
-    async def openisle_search(
-        keyword: Annotated[str, Field(description="Keyword used to query OpenIsle search.")],
-        scope: Annotated[
-            SearchScope,
-            Field(
-                description=(
-                    "Scope of the search. Use 'global' to search across users and posts, or specify "
-                    "'users', 'posts', 'posts_title', or 'posts_content' to narrow the results."
-                )
-            ),
-        ] = SearchScope.GLOBAL,
-        context: Context | None = None,
-    ) -> SearchResponse:
-        try:
-            return await _execute_search(api=api_client, scope=scope, keyword=keyword, context=context)
-        except Exception as exc:  # pragma: no cover - surfaced to the MCP runtime
-            error_message = f"Search failed: {exc}"
-            if context is not None:
-                context.error(error_message)
-            _logger.exception("Search tool failed")
-            raise
-
-    return server
+_BACKEND_CLIENT = OpenIsleBackendClient(
+    base_url=_env("OPENISLE_BACKEND_URL", DEFAULT_BACKEND_URL) or DEFAULT_BACKEND_URL,
+    timeout=_load_timeout(),
+)
+_PUBLIC_BASE_URL = _env("OPENISLE_PUBLIC_BASE_URL")
+
+
+def _build_url(result: BackendSearchResult) -> Optional[str]:
+    if not _PUBLIC_BASE_URL:
+        return None
+    base = _PUBLIC_BASE_URL.rstrip("/")
+    if result.type in {"post", "post_title"} and result.id is not None:
+        return f"{base}/posts/{result.id}"
+    if result.type == "comment" and result.post_id is not None:
+        anchor = f"#comment-{result.id}" if result.id is not None else ""
+        return f"{base}/posts/{result.post_id}{anchor}"
+    if result.type == "user" and result.id is not None:
+        return f"{base}/users/{result.id}"
+    if result.type == "category" and result.id is not None:
+        return f"{base}/?categoryId={result.id}"
+    if result.type == "tag" and result.id is not None:
+        return f"{base}/?tagIds={result.id}"
+    return None
+
+
+def _to_search_result(result: BackendSearchResult) -> SearchResult:
+    highlights = {
+        "text": result.highlighted_text,
+        "subText": result.highlighted_sub_text,
+        "extra": result.highlighted_extra,
+    }
+    # Remove empty highlight entries to keep the payload clean
+    highlights = {key: value for key, value in highlights.items() if value}
+    return SearchResult(
+        type=result.type,
+        id=result.id,
+        title=result.text,
+        subtitle=result.sub_text,
+        extra=result.extra,
+        post_id=result.post_id,
+        url=_build_url(result),
+        highlights=highlights,
+    )
+
+
+KeywordParam = Annotated[str, Field(description="Keyword to search for", min_length=1)]
+LimitParam = Annotated[
+    int,
+    Field(ge=1, le=MAX_LIMIT, description=f"Maximum number of results to return (<= {MAX_LIMIT})"),
+]
+
+
+@server.tool(name="search", description="Search OpenIsle content")
+async def search(keyword: KeywordParam, limit: LimitParam = DEFAULT_LIMIT, ctx: Optional[Context] = None) -> SearchResponse:
+    """Run a search query against the OpenIsle backend."""
+
+    trimmed = keyword.strip()
+    if not trimmed:
+        raise mcp_exceptions.ToolError("Keyword must not be empty")
+
+    if ctx is not None:
+        await ctx.debug(f"Searching OpenIsle for '{trimmed}' (limit={limit})")
+
+    try:
+        raw_results = await _BACKEND_CLIENT.search_global(trimmed)
+    except BackendClientError as exc:
+        if ctx is not None:
+            await ctx.error(f"Search request failed: {exc}")
+        raise mcp_exceptions.ToolError(f"Search failed: {exc}") from exc
+
+    results = [_to_search_result(result) for result in raw_results]
+    limited = results[:limit]
+
+    if ctx is not None:
+        await ctx.info(
+            "Search completed",
+            keyword=trimmed,
+            total_results=len(results),
+            returned=len(limited),
+        )
+
+    return SearchResponse(keyword=trimmed, total_results=len(results), limit=limit, results=limited)


 def main() -> None:
     """CLI entry point used by the console script."""
+    parser = argparse.ArgumentParser(description="Run the OpenIsle MCP server")
+    parser.add_argument(
+        "--transport",
+        choices=["stdio", "sse", "streamable-http"],
+        default=_env("OPENISLE_MCP_TRANSPORT", DEFAULT_TRANSPORT),
+        help="Transport protocol to use",
+    )
+    parser.add_argument(
+        "--mount-path",
+        default=_env("OPENISLE_MCP_SSE_MOUNT_PATH", "/mcp"),
+        help="Mount path when using the SSE transport",
+    )
+    args = parser.parse_args()

-    settings = get_settings()
-    configure_logging("INFO")
-    server = build_server(settings)
+    logging.basicConfig(level=os.getenv("OPENISLE_MCP_LOG_LEVEL", "INFO"))
+    logger.info(
+        "Starting OpenIsle MCP server", extra={"transport": args.transport, "backend": _BACKEND_CLIENT.base_url}
+    )

-    transport = os.getenv("OPENISLE_MCP_TRANSPORT", settings.transport)
-    if transport not in {"stdio", "sse", "streamable-http"}:
-        raise RuntimeError(f"Unsupported transport mode: {transport}")
-
-    _logger.info("Starting OpenIsle MCP server on %s:%s via %s", settings.host, settings.port, transport)
-
-    if transport == "stdio":
-        server.run("stdio")
-    elif transport == "sse":
-        mount_path = os.getenv("OPENISLE_MCP_SSE_PATH")
-        server.run("sse", mount_path=mount_path)
-    else:
-        server.run("streamable-http")
+    server.run(transport=args.transport, mount_path=args.mount_path)


-if __name__ == "__main__":  # pragma: no cover - manual execution path
+if __name__ == "__main__":
     main()
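To see how the new deep links are assembled, a hedged sketch of `_build_url`: the public base URL below is a placeholder, and it must be in the environment before `openisle_mcp.server` is imported, since the module reads `OPENISLE_PUBLIC_BASE_URL` at import time.

```python
import os

# Placeholder base URL; set it before importing the server module so _PUBLIC_BASE_URL picks it up.
os.environ["OPENISLE_PUBLIC_BASE_URL"] = "https://openisle.example"

from openisle_mcp.models import BackendSearchResult
from openisle_mcp.server import _build_url

comment = BackendSearchResult(type="comment", id=7, postId=42)
print(_build_url(comment))  # https://openisle.example/posts/42#comment-7

user = BackendSearchResult(type="user", id=3)
print(_build_url(user))     # https://openisle.example/users/3
```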