Compare commits
Comparing user-auth-... with feature/do... (1 commit, 8942760a5a)

README.md (31 lines changed)
@@ -1,6 +1,6 @@
# Kafka visual management platform
A lightweight visual management console for Kafka; quick to install and configure, simple to use.
To keep development simple there is no internationalization support; the pages are displayed in Chinese only.
To keep development simple there is no internationalization support; display is in Chinese only.
If you have used rocketmq-console: yes, the front-end style is quite similar to it.

## Page preview
@@ -8,9 +8,7 @@
## Cluster migration support
The current main branch and later versions no longer ship a solution for message synchronization or cluster migration. If you need it, see: [Cluster migration notes](./document/datasync/集群迁移.md)
## ACL notes
Running the latest code, the ACL menu is available right away. Permission (authorization) management has been separated from authentication user management (SASL_SCRAM). After this separation, user changes are supported when only SASL_SCRAM authentication is enabled (authorization not enabled), and visual permission management can also be used with other authentication mechanisms; visual management of authentication users, however, currently supports SCRAM only.

Before v1.0.6, if the Kafka cluster had ACL enabled but the console did not show the ACL menu, see: [ACL setup notes](./document/acl/Acl.md)
ACL configuration notes: if the Kafka cluster has ACL enabled but the console does not show the ACL menu, see: [ACL setup notes](./document/acl/Acl.md)
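The separation described above maps onto two independent Kafka Admin APIs: SCRAM credentials (authentication) are managed separately from ACL bindings (authorization). A minimal illustrative sketch with the plain Kafka `Admin` client, not code from this repository; the broker address, user name and topic are placeholders:

```java
import org.apache.kafka.clients.admin.*;
import org.apache.kafka.common.acl.*;
import org.apache.kafka.common.resource.*;

import java.util.Collections;
import java.util.Properties;

public class ScramAndAclSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder address
        try (Admin admin = Admin.create(props)) {
            // Authentication side: create or update a SASL_SCRAM user, no ACLs involved.
            UserScramCredentialUpsertion upsert = new UserScramCredentialUpsertion(
                    "alice",
                    new ScramCredentialInfo(ScramMechanism.SCRAM_SHA_256, 4096),
                    "alice-secret");
            admin.alterUserScramCredentials(Collections.singletonList(upsert)).all().get();

            // Authorization side: grant the same principal read access to a topic.
            AclBinding binding = new AclBinding(
                    new ResourcePattern(ResourceType.TOPIC, "demo-topic", PatternType.LITERAL),
                    new AccessControlEntry("User:alice", "*", AclOperation.READ, AclPermissionType.ALLOW));
            admin.createAcls(Collections.singletonList(binding)).all().get();
        }
    }
}
```

Because the two calls are independent, user management can work while authorization is disabled, and ACL management can be used under other authentication mechanisms.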
## Features
* Multi-cluster support
* Cluster info
@@ -18,18 +16,13 @@ Before v1.0.6, if the Kafka cluster had ACL enabled but the console did not show the ACL
* Consumer group management
* Message management
* ACL
* Client quotas (throttling)
* Operations

See this mind map for the detailed feature list:



## Package download
Click to download (v1.0.7): [kafka-console-ui.zip](https://github.com/xxd763795151/kafka-console-ui/releases/download/v1.0.7/kafka-console-ui.zip)

If the package downloads slowly, see the build-from-source instructions below: clone the code and build it locally.

If GitHub is slow you can also try gitee: [kafka-console-ui.zip from gitee](https://gitee.com/xiaodong_xu/kafka-console-ui/releases/download/v1.0.7/kafka-console-ui.zip)
Click to download (v1.0.4): [kafka-console-ui.zip](https://github.com/xxd763795151/kafka-console-ui/releases/download/v1.0.4/kafka-console-ui.zip)

## Quick start
### Windows
@@ -68,7 +61,7 @@ sh bin/shutdown.sh
When adding a cluster, besides the cluster address you can also enter other cluster properties, such as the request timeout and ACL-related settings. If ACL is enabled, the ACL menu appears in the navigation bar when you switch to that cluster and the related operations become available (SASL_SCRAM-based authentication/authorization management is the most thoroughly supported; I have not verified the other mechanisms, and even as the author I have not fully verified every part of this feature, but the authorization side should be generic).
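The "other properties" mentioned above are ordinary Kafka client settings. A hypothetical example of what such a property set could contain; the exact keys the console accepts are not specified here, and all values are placeholders:

```java
import java.util.Properties;

// Illustrative only: standard Kafka client properties of the kind referred to above
// (request timeout, SASL/ACL settings). Values are placeholders, not real credentials.
public class ClusterPropsSketch {
    public static Properties saslScramClusterProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092,broker2:9092");
        props.put("request.timeout.ms", "15000");
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "SCRAM-SHA-256");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.scram.ScramLoginModule required "
                        + "username=\"admin\" password=\"admin-secret\";");
        return props;
    }
}
```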
## Kafka version
* Currently built against Kafka 3.2.0
* Currently built against Kafka 2.8.0
## Monitoring
Only operations and management features are provided; monitoring and alerting need to be paired with other components. If you need them, see: https://blog.csdn.net/x763795151/article/details/119705372

@@ -78,22 +71,10 @@ sh bin/shutdown.sh
## Local development
If you want to develop locally, see the development environment setup: [Local development](./document/develop/开发配置.md)

## Login authentication and permissions
The main branch currently does not support login authentication. Thanks to @dongyinuo, who developed a version with login authentication and the related button-level permissions (mainly two roles: administrator and regular developer).
It lives on the branch feature/dongyinuo/20220501/devops.
If you need console login authentication, switch to that branch and build it; see the build-from-source instructions for how to package it.
Default login account: admin/kafka-console-ui521

## Docker Compose deployment
Thanks to @wdkang123 for sharing this deployment approach; if you need it, see [Docker Compose deployment](./document/deploy/docker部署.md)

## Contact
+ WeChat group

<img src="./document/contact/weixin_contact.jpg" width="40%"/>

[//]: # (<img src="https://github.com/xxd763795151/kafka-console-ui/blob/main/document/contact/weixin_contact.jpg" width="40%"/>)
<img src="https://github.com/dongyinuo/kafka-console-ui/blob/feature/dongyinuo/add/contact/document/contact/weixin_contact.jpeg" width="40%"/>

+ If the contact above stops working, add the following WeChat account and explain your intent
  - xxd763795151
  - wxid_7jy2ezljvebt12
@@ -5,4 +5,4 @@ set JAVA_OPTS=-Xmx512m -Xms512m -Xmn256m -Xss256k
set CONFIG_FILE=../config/application.yml
set TARGET=../lib/kafka-console-ui.jar
set DATA_DIR=..
"%JAVA_CMD%" -jar %TARGET% --spring.config.location=%CONFIG_FILE% --data.dir=%DATA_DIR%
%JAVA_CMD% -jar %TARGET% --spring.config.location=%CONFIG_FILE% --data.dir=%DATA_DIR%
@@ -26,7 +26,7 @@ kafka:
It explains that the kafka.config.enable-acl option needs to be set to true.

Note: **this approach is no longer supported**
## Notes for versions before v1.0.6
## Notes for the new version
Because multi-cluster configuration is now supported (see the "configure clusters" section of the project home page),
these extra configuration options have been removed here.
BIN document/contact/weixin_contact.jpeg (new file)
Binary file not shown. (After: 128 KiB)
Binary file not shown. (Before: 178 KiB)
@@ -1,189 +0,0 @@
# Docker / Docker Compose deployment

# 1. Quick start

## 1.1 Pull the image

```shell
docker pull wdkang/kafka-console-ui
```

## 1.2 List images

```shell
docker images
```

## 1.3 Start the service

Data is not persisted inside the Docker container, so it is recommended to map the data directory onto the host machine.

See **2. Data persistence** for details.

```shell
docker run -d -p 7766:7766 wdkang/kafka-console-ui
```

## 1.4 Check the status

```shell
docker ps -a
```

## 1.5 View the logs

```shell
docker logs -f ${containerId}
```

## 1.6 Access the service

```shell
http://localhost:7766
```

# 2. Data persistence

Persisting the data is recommended.

## 2.1 Create the directories

```shell
mkdir -p /home/kafka-console-ui/data /home/kafka-console-ui/log
cd /home/kafka-console-ui
```

## 2.2 Start the service

```shell
docker run -d -p 7766:7766 -v $PWD/data:/app/data -v $PWD/log:/app/log wdkang/kafka-console-ui
```

# 3. Build your own image

## 3.1 Build the image

**Prerequisites**

(Adjust the Dockerfile to your own needs.)

Download the [kafka-console-ui.zip](https://github.com/xxd763795151/kafka-console-ui/releases) package.

After unpacking it, put the Dockerfile in the root of the extracted folder.

**Dockerfile**

```dockerfile
# jdk
FROM openjdk:8-jdk-alpine
# label
LABEL by="https://github.com/xxd763795151/kafka-console-ui"
# root
RUN mkdir -p /app && cd /app
WORKDIR /app
# config log data
RUN mkdir -p /app/config && mkdir -p /app/log && mkdir -p /app/data && mkdir -p /app/lib
# add file
ADD ./lib/kafka-console-ui.jar /app/lib
ADD ./config /app/config
# port
EXPOSE 7766
# start server
CMD java -jar -Xmx512m -Xms512m -Xmn256m -Xss256k /app/lib/kafka-console-ui.jar --spring.config.location="/app/config/" --logging.home="/app/log" --data.dir="/app/data"

```

**Build**

From the root of that folder

(note the trailing dot)

```shell
docker build -t ${your_docker_hub_addr} .
```

## 3.2 Push the image

```shell
docker push ${your_docker_hub_addr}
```

# 4. Container orchestration

```yaml
# docker-compose definition
version: '3'
services:
  # service name
  kafka-console-ui:
    # container name
    container_name: "kafka-console-ui"
    # ports
    ports:
      - "7766:7766"
    # persistence
    volumes:
      - ./data:/app/data
      - ./log:/app/log
    # avoid file read/write permission problems
    privileged: true
    user: root
    # image
    image: "wdkang/kafka-console-ui"
```

## 4.1 Pull the image

```shell
docker-compose pull kafka-console-ui
```

## 4.2 Build and start

```shell
docker-compose up --detach --build kafka-console-ui
```

## 4.3 Check the status

```shell
docker-compose ps -a
```

## 4.4 Stop the service

```shell
docker-compose down
```
@@ -12,11 +12,8 @@
* scala 2.13
* maven >=3.6+
* webstorm
* Node

Apart from WebStorm, which is just the front-end IDE and can be replaced with whatever you prefer, the JDK and Scala are mandatory.

While developing I used Node v14.16.0 locally; download: https://nodejs.org/download/release/v14.16.0/ . I have not tested whether much higher or lower versions work.

Scala 2.13 download link, at the bottom of this page: https://www.scala-lang.org/download/scala2.html
## Clone the code
@@ -24,8 +21,7 @@ Scala 2.13 download link, at the bottom of this page: https://www.scala-lang.org/d
## Back-end setup
1. Open the project with IDEA.
2. Open IDEA's Project Structure (Settings) -> Modules -> mark src/main/scala as Sources; by convention src/main/java is the source directory, so an extra source root has to be added here.
3. Open IDEA's Settings -> Plugins, search for the Scala plugin and install it; IDEA should need a restart for it to take effect. This step must come before step 4.
4. Open IDEA's Project Structure (Settings) -> Libraries, add a Scala SDK and select your locally downloaded Scala 2.13 directory to add it (depending on the IDEA version you can also pick it directly without downloading it first).
3. Open IDEA's Project Structure (Settings) -> Libraries, add a Scala SDK and select your locally downloaded Scala 2.13 directory to add it (depending on the IDEA version you can also pick it directly without downloading it first).
## Front end
The front-end code lives in the project's ui directory; open it with any front-end IDE such as WebStorm and develop from there.
Binary file not shown. (Before: 99 KiB)
Binary file not shown. (Before: 439 KiB | After: 99 KiB)
@@ -29,6 +29,4 @@ package.bat
cd kafka-console-ui
# run this on Linux or macOS
sh package.sh
```

When packaging finishes, a kafka-console-ui.zip package is generated under the target directory
```
pom.xml (31 lines changed)
@@ -10,7 +10,7 @@
    </parent>
    <groupId>com.xuxd</groupId>
    <artifactId>kafka-console-ui</artifactId>
    <version>1.0.7</version>
    <version>1.0.4</version>
    <name>kafka-console-ui</name>
    <description>Kafka console manage ui</description>
    <properties>
@@ -21,13 +21,18 @@
        <ui.path>${project.basedir}/ui</ui.path>
        <frontend-maven-plugin.version>1.11.0</frontend-maven-plugin.version>
        <compiler.version>1.8</compiler.version>
        <kafka.version>3.2.0</kafka.version>
        <kafka.version>2.8.0</kafka.version>
        <maven.assembly.plugin.version>3.0.0</maven.assembly.plugin.version>
        <mybatis-plus-boot-starter.version>3.4.2</mybatis-plus-boot-starter.version>
        <scala.version>2.13.6</scala.version>
        <spring-framework.version>5.3.26</spring-framework.version>
        <jwt.version>0.9.0</jwt.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>io.jsonwebtoken</groupId>
            <artifactId>jjwt</artifactId>
            <version>${jwt.version}</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
@@ -49,11 +54,6 @@
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-aop</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
@@ -82,18 +82,6 @@
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka_2.13</artifactId>
            <version>${kafka.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>com.typesafe.scala-logging</groupId>
                    <artifactId>scala-logging_2.13</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>com.typesafe.scala-logging</groupId>
            <artifactId>scala-logging_2.13</artifactId>
            <version>3.9.2</version>
        </dependency>

        <dependency>
@@ -225,8 +213,7 @@
            <goal>npm</goal>
        </goals>
        <configuration>
            <!-- <arguments>install --registry=https://registry.npmjs.org/</arguments>-->
            <arguments>install --registry=https://registry.npm.taobao.org</arguments>
            <arguments>install --registry=https://registry.npmjs.org/</arguments>
        </configuration>
    </execution>
    <execution>
@@ -6,9 +6,6 @@ import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.web.servlet.ServletComponentScan;
import org.springframework.scheduling.annotation.EnableScheduling;

/**
 * @author 晓东哥哥
 */
@MapperScan("com.xuxd.kafka.console.dao")
@SpringBootApplication
@EnableScheduling
@@ -1,117 +0,0 @@
|
||||
package com.xuxd.kafka.console.aspect;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.ControllerLog;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.aspectj.lang.ProceedingJoinPoint;
|
||||
import org.aspectj.lang.annotation.Around;
|
||||
import org.aspectj.lang.annotation.Aspect;
|
||||
import org.aspectj.lang.annotation.Pointcut;
|
||||
import org.springframework.core.annotation.Order;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
import java.lang.reflect.Method;
|
||||
import java.util.HashMap;
|
||||
import java.util.Map;
|
||||
import java.util.concurrent.locks.ReentrantLock;
|
||||
|
||||
/**
|
||||
* @author 晓东哥哥
|
||||
*/
|
||||
@Slf4j
|
||||
@Order(-1)
|
||||
@Aspect
|
||||
@Component
|
||||
public class ControllerLogAspect {
|
||||
|
||||
private Map<String, String> descMap = new HashMap<>();
|
||||
|
||||
private ReentrantLock lock = new ReentrantLock();
|
||||
|
||||
@Pointcut("@annotation(com.xuxd.kafka.console.aspect.annotation.ControllerLog)")
|
||||
private void pointcut() {
|
||||
|
||||
}
|
||||
|
||||
@Around("pointcut()")
|
||||
public Object around(ProceedingJoinPoint joinPoint) throws Throwable {
|
||||
StringBuilder params = new StringBuilder("[");
|
||||
try {
|
||||
String methodName = getMethodFullName(joinPoint.getTarget().getClass().getName(), joinPoint.getSignature().getName());
|
||||
|
||||
if (!descMap.containsKey(methodName)) {
|
||||
cacheDescInfo(joinPoint);
|
||||
}
|
||||
|
||||
Object[] args = joinPoint.getArgs();
|
||||
long startTime = System.currentTimeMillis();
|
||||
Object res = joinPoint.proceed();
|
||||
long endTime = System.currentTimeMillis();
|
||||
|
||||
for (int i = 0; i < args.length; i++) {
|
||||
params.append(args[i]);
|
||||
}
|
||||
params.append("]");
|
||||
|
||||
String resStr = "[" + (res != null ? res.toString() : "") + "]";
|
||||
|
||||
StringBuilder sb = new StringBuilder();
|
||||
String shortMethodName = descMap.getOrDefault(methodName, ".-");
|
||||
shortMethodName = shortMethodName.substring(shortMethodName.lastIndexOf(".") + 1);
|
||||
sb.append("[").append(shortMethodName)
|
||||
.append("调用完成: ")
|
||||
.append("请求参数=").append(params).append(", ")
|
||||
.append("响应值=").append(resStr).append(", ")
|
||||
.append("耗时=").append(endTime - startTime)
|
||||
.append(" ms");
|
||||
log.info(sb.toString());
|
||||
return res;
|
||||
} catch (Throwable e) {
|
||||
log.error("调用方法异常, 请求参数:" + params, e);
|
||||
throw e;
|
||||
}
|
||||
}
|
||||
|
||||
private void cacheDescInfo(ProceedingJoinPoint joinPoint) {
|
||||
lock.lock();
|
||||
try {
|
||||
String methodName = joinPoint.getSignature().getName();
|
||||
Class<?> aClass = joinPoint.getTarget().getClass();
|
||||
|
||||
Method method = null;
|
||||
try {
|
||||
Object[] args = joinPoint.getArgs();
|
||||
|
||||
Class<?>[] clzArr = new Class[args.length];
|
||||
for (int i = 0; i < args.length; i++) {
|
||||
clzArr[i] = args[i].getClass();
|
||||
}
|
||||
method = aClass.getDeclaredMethod(methodName, clzArr);
|
||||
|
||||
} catch (Exception e) {
|
||||
log.warn("cacheDescInfo error: {}", e.getMessage());
|
||||
}
|
||||
|
||||
String fullMethodName = getMethodFullName(aClass.getName(), methodName);
|
||||
String desc = "[" + fullMethodName + "]";
|
||||
if (method == null) {
|
||||
descMap.put(fullMethodName, desc);
|
||||
return;
|
||||
}
|
||||
|
||||
ControllerLog controllerLog = method.getAnnotation(ControllerLog.class);
|
||||
String value = controllerLog.value();
|
||||
if (StringUtils.isBlank(value)) {
|
||||
descMap.put(fullMethodName, desc);
|
||||
} else {
|
||||
descMap.put(fullMethodName, value);
|
||||
}
|
||||
} finally {
|
||||
lock.unlock();
|
||||
}
|
||||
}
|
||||
|
||||
private String getMethodFullName(String className, String methodName) {
|
||||
return className + "#" + methodName;
|
||||
}
|
||||
}
|
||||
@@ -1,127 +0,0 @@
|
||||
package com.xuxd.kafka.console.aspect;
|
||||
|
||||
import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.Credentials;
|
||||
import com.xuxd.kafka.console.beans.dos.SysUserDO;
|
||||
import com.xuxd.kafka.console.cache.RolePermCache;
|
||||
import com.xuxd.kafka.console.config.AuthConfig;
|
||||
import com.xuxd.kafka.console.dao.SysPermissionMapper;
|
||||
import com.xuxd.kafka.console.dao.SysRoleMapper;
|
||||
import com.xuxd.kafka.console.dao.SysUserMapper;
|
||||
import com.xuxd.kafka.console.exception.UnAuthorizedException;
|
||||
import com.xuxd.kafka.console.filter.CredentialsContext;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.aspectj.lang.JoinPoint;
|
||||
import org.aspectj.lang.annotation.Aspect;
|
||||
import org.aspectj.lang.annotation.Before;
|
||||
import org.aspectj.lang.annotation.Pointcut;
|
||||
import org.aspectj.lang.reflect.MethodSignature;
|
||||
import org.springframework.core.annotation.Order;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
import java.lang.reflect.Method;
|
||||
import java.util.*;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/17 22:32
|
||||
**/
|
||||
@Slf4j
|
||||
@Order(1)
|
||||
@Aspect
|
||||
@Component
|
||||
public class PermissionAspect {
|
||||
|
||||
|
||||
private Map<String, Set<String>> permMap = new HashMap<>();
|
||||
|
||||
private final AuthConfig authConfig;
|
||||
|
||||
private final SysUserMapper userMapper;
|
||||
|
||||
private final SysRoleMapper roleMapper;
|
||||
|
||||
private final SysPermissionMapper permissionMapper;
|
||||
|
||||
private final RolePermCache rolePermCache;
|
||||
|
||||
public PermissionAspect(AuthConfig authConfig,
|
||||
SysUserMapper userMapper,
|
||||
SysRoleMapper roleMapper,
|
||||
SysPermissionMapper permissionMapper,
|
||||
RolePermCache rolePermCache) {
|
||||
this.authConfig = authConfig;
|
||||
this.userMapper = userMapper;
|
||||
this.roleMapper = roleMapper;
|
||||
this.permissionMapper = permissionMapper;
|
||||
this.rolePermCache = rolePermCache;
|
||||
}
|
||||
|
||||
@Pointcut("@annotation(com.xuxd.kafka.console.aspect.annotation.Permission)")
|
||||
private void pointcut() {
|
||||
|
||||
}
|
||||
|
||||
@Before(value = "pointcut()")
|
||||
public void before(JoinPoint joinPoint) {
|
||||
if (!authConfig.isEnable()) {
|
||||
return;
|
||||
}
|
||||
MethodSignature signature = (MethodSignature) joinPoint.getSignature();
|
||||
Method method = signature.getMethod();
|
||||
Permission permission = method.getAnnotation(Permission.class);
|
||||
if (permission == null) {
|
||||
return;
|
||||
}
|
||||
String[] value = permission.value();
|
||||
if (value == null || value.length == 0) {
|
||||
return;
|
||||
}
|
||||
String name = method.getName() + "@" + method.hashCode();
|
||||
|
||||
Map<String, Set<String>> pm = checkPermMap(name, value);
|
||||
|
||||
Set<String> allowPermSet = pm.get(name);
|
||||
if (allowPermSet == null) {
|
||||
log.error("解析权限出现意外啦!!!");
|
||||
return;
|
||||
}
|
||||
|
||||
Credentials credentials = CredentialsContext.get();
|
||||
if (credentials == null || credentials.isInvalid()) {
|
||||
throw new UnAuthorizedException("credentials is invalid");
|
||||
}
|
||||
QueryWrapper<SysUserDO> queryWrapper = new QueryWrapper<>();
|
||||
queryWrapper.eq("username", credentials.getUsername());
|
||||
SysUserDO userDO = userMapper.selectOne(queryWrapper);
|
||||
if (userDO == null) {
|
||||
throw new UnAuthorizedException(credentials.getUsername() + ":" + allowPermSet);
|
||||
}
|
||||
|
||||
String roleIds = userDO.getRoleIds();
|
||||
List<Long> roleIdList = Arrays.stream(roleIds.split(",")).map(String::trim).filter(StringUtils::isNotEmpty).map(Long::valueOf).collect(Collectors.toList());
|
||||
for (Long roleId : roleIdList) {
|
||||
Set<String> permSet = rolePermCache.getRolePermCache().getOrDefault(roleId, Collections.emptySet());
|
||||
for (String p : allowPermSet) {
|
||||
if (permSet.contains(p)) {
|
||||
return;
|
||||
}
|
||||
}
|
||||
}
|
||||
throw new UnAuthorizedException(credentials.getUsername() + ":" + allowPermSet);
|
||||
}
|
||||
|
||||
private Map<String, Set<String>> checkPermMap(String methodName, String[] value) {
|
||||
if (!permMap.containsKey(methodName)) {
|
||||
Map<String, Set<String>> map = new HashMap<>(permMap);
|
||||
map.put(methodName, new HashSet<>(Arrays.asList(value)));
|
||||
permMap = map;
|
||||
return map;
|
||||
}
|
||||
return permMap;
|
||||
}
|
||||
|
||||
}
|
||||
@@ -1,15 +0,0 @@
package com.xuxd.kafka.console.aspect.annotation;

import java.lang.annotation.*;

/**
 * This annotation is applied to controller-layer methods.
 * @author 晓东哥哥
 */
@Inherited
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface ControllerLog {

    String value() default "";
}
@@ -1,17 +0,0 @@
package com.xuxd.kafka.console.aspect.annotation;

import java.lang.annotation.*;

/**
 * Permission annotation: when authentication is enabled, only users holding one of these permissions can call the annotated endpoint.
 *
 * @author: xuxd
 * @date: 2023/5/17 22:30
 **/
@Inherited
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface Permission {

    String[] value() default {};
}
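For context, the controller hunks later in this diff attach the annotation to endpoints roughly as in the following condensed sketch (the endpoint and return value are illustrative):

```java
import com.xuxd.kafka.console.aspect.annotation.Permission;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

// Sketch only: mirrors how the controller hunks below guard endpoints.
// When auth is enabled, PermissionAspect checks the caller's roles against these keys.
@RestController
public class ExampleController {

    @Permission({"acl:authority:detail", "acl:sasl-scram:detail"})
    @PostMapping("/detail")
    public Object getAclDetailList(@RequestBody Object param) {
        return "..."; // the real code delegates to the service layer
    }
}
```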
@@ -1,15 +1,18 @@
package com.xuxd.kafka.console.beans;

import java.util.Objects;
import org.apache.commons.lang3.StringUtils;
import org.apache.kafka.common.acl.*;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AccessControlEntryFilter;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclBindingFilter;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourcePatternFilter;
import org.apache.kafka.common.resource.ResourceType;
import org.apache.kafka.common.security.auth.KafkaPrincipal;
import org.apache.kafka.common.utils.SecurityUtils;

import java.util.Objects;

/**
 * kafka-console-ui.
@@ -38,9 +41,7 @@ public class AclEntry {
        entry.setResourceType(binding.pattern().resourceType().name());
        entry.setName(binding.pattern().name());
        entry.setPatternType(binding.pattern().patternType().name());
//        entry.setPrincipal(KafkaPrincipal.fromString(binding.entry().principal()).getName());
        // Use this method on Kafka 3.x
        entry.setPrincipal(SecurityUtils.parseKafkaPrincipal(binding.entry().principal()).getName());
        entry.setPrincipal(KafkaPrincipal.fromString(binding.entry().principal()).getName());
        entry.setHost(binding.entry().host());
        entry.setOperation(binding.entry().operation().name());
        entry.setPermissionType(binding.entry().permissionType().name());
@@ -1,21 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans;
|
||||
|
||||
import lombok.Data;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/14 19:37
|
||||
**/
|
||||
@Data
|
||||
public class Credentials {
|
||||
|
||||
public static final Credentials INVALID = new Credentials();
|
||||
|
||||
private String username;
|
||||
|
||||
private long expiration;
|
||||
|
||||
public boolean isInvalid() {
|
||||
return this == INVALID;
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,7 @@
package com.xuxd.kafka.console.beans;

public class KafkaConsoleException extends RuntimeException {
    public KafkaConsoleException(String msg) {
        super(msg);
    }
}
@@ -1,17 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans;
|
||||
|
||||
import lombok.Data;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/14 20:44
|
||||
**/
|
||||
@Data
|
||||
public class LoginResult {
|
||||
|
||||
private String token;
|
||||
|
||||
private List<String> permissions;
|
||||
}
|
||||
@@ -11,7 +11,7 @@ import lombok.Setter;
 **/
public class ResponseData<T> {

    public static final int SUCCESS_CODE = 0, FAILED_CODE = -9999;
    public static final int SUCCESS_CODE = 0, TOKEN_ILLEGAL = -5000, FAILED_CODE = -9999;

    public static final String SUCCESS_MSG = "success", FAILED_MSG = "failed";

@@ -58,6 +58,12 @@
        return this;
    }

    public ResponseData<T> failed(int code) {
        this.code = code;
        this.msg = FAILED_MSG;
        return this;
    }

    public ResponseData<T> failed(String msg) {
        this.code = FAILED_CODE;
        this.msg = msg;
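The new TOKEN_ILLEGAL constant and the failed(int) overload give authentication code a dedicated status for rejected tokens. A hypothetical usage sketch; only ResponseData itself comes from this diff, the surrounding helper is assumed:

```java
import com.xuxd.kafka.console.beans.ResponseData;

// Assumed helper for illustration: builds the body a login filter might return
// when a JWT is missing or expired. Only ResponseData and TOKEN_ILLEGAL are from the diff.
public class TokenRejectSketch {
    public static ResponseData rejectInvalidToken() {
        return ResponseData.create().failed(ResponseData.TOKEN_ILLEGAL);
    }
}
```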
@@ -1,22 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans;
|
||||
|
||||
import lombok.Getter;
|
||||
import lombok.Setter;
|
||||
import lombok.ToString;
|
||||
import org.springframework.context.ApplicationEvent;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/18 15:49
|
||||
**/
|
||||
@ToString
|
||||
public class RolePermUpdateEvent extends ApplicationEvent {
|
||||
|
||||
@Getter
|
||||
@Setter
|
||||
private boolean reload = false;
|
||||
|
||||
public RolePermUpdateEvent(Object source) {
|
||||
super(source);
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,9 @@
|
||||
package com.xuxd.kafka.console.beans.annotation;
|
||||
|
||||
import java.lang.annotation.*;
|
||||
|
||||
@Target({ElementType.TYPE, ElementType.METHOD})
|
||||
@Retention(RetentionPolicy.RUNTIME)
|
||||
@Documented
|
||||
public @interface RequiredAuthorize {
|
||||
}
|
||||
@@ -3,15 +3,14 @@ package com.xuxd.kafka.console.beans.dos;
|
||||
import com.baomidou.mybatisplus.annotation.IdType;
|
||||
import com.baomidou.mybatisplus.annotation.TableId;
|
||||
import com.baomidou.mybatisplus.annotation.TableName;
|
||||
import com.xuxd.kafka.console.beans.enums.Role;
|
||||
import lombok.Data;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:17
|
||||
**/
|
||||
import java.util.Date;
|
||||
|
||||
@Data
|
||||
@TableName("t_sys_user")
|
||||
public class SysUserDO {
|
||||
@TableName("t_devops_user")
|
||||
public class DevOpsUserDO {
|
||||
|
||||
@TableId(type = IdType.AUTO)
|
||||
private Long id;
|
||||
@@ -20,7 +19,11 @@ public class SysUserDO {
|
||||
|
||||
private String password;
|
||||
|
||||
private String salt;
|
||||
private Role role;
|
||||
|
||||
private String roleIds;
|
||||
private boolean delete;
|
||||
|
||||
private Date createTime;
|
||||
|
||||
private Date updateTime;
|
||||
}
|
||||
@@ -1,29 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.dos;
|
||||
|
||||
import com.baomidou.mybatisplus.annotation.IdType;
|
||||
import com.baomidou.mybatisplus.annotation.TableId;
|
||||
import com.baomidou.mybatisplus.annotation.TableName;
|
||||
import lombok.Data;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:17
|
||||
**/
|
||||
@Data
|
||||
@TableName("t_sys_permission")
|
||||
public class SysPermissionDO {
|
||||
|
||||
@TableId(type = IdType.AUTO)
|
||||
private Long id;
|
||||
|
||||
private String name;
|
||||
|
||||
/**
|
||||
* 权限类型: 0:菜单,1:按钮
|
||||
*/
|
||||
private Integer type;
|
||||
|
||||
private Long parentId;
|
||||
|
||||
private String permission;
|
||||
}
|
||||
@@ -1,24 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.dos;
|
||||
|
||||
import com.baomidou.mybatisplus.annotation.IdType;
|
||||
import com.baomidou.mybatisplus.annotation.TableId;
|
||||
import com.baomidou.mybatisplus.annotation.TableName;
|
||||
import lombok.Data;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:17
|
||||
**/
|
||||
@Data
|
||||
@TableName("t_sys_role")
|
||||
public class SysRoleDO {
|
||||
|
||||
@TableId(type = IdType.AUTO)
|
||||
private Long id;
|
||||
|
||||
private String roleName;
|
||||
|
||||
private String description;
|
||||
|
||||
private String permissionIds;
|
||||
}
|
||||
@@ -1,27 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.dto;
|
||||
|
||||
import lombok.Data;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/1/10 20:12
|
||||
**/
|
||||
@Data
|
||||
public class AlterClientQuotaDTO {
|
||||
|
||||
private String type;
|
||||
|
||||
private List<String> types;
|
||||
|
||||
private List<String> names;
|
||||
|
||||
private String consumerRate;
|
||||
|
||||
private String producerRate;
|
||||
|
||||
private String requestPercentage;
|
||||
|
||||
private List<String> deleteConfigs;
|
||||
}
|
||||
@@ -1,15 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.dto;
|
||||
|
||||
import lombok.Data;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/14 18:59
|
||||
**/
|
||||
@Data
|
||||
public class LoginUserDTO {
|
||||
|
||||
private String username;
|
||||
|
||||
private String password;
|
||||
}
|
||||
@@ -1,17 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.dto;
|
||||
|
||||
import lombok.Data;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/1/9 21:53
|
||||
**/
|
||||
@Data
|
||||
public class QueryClientQuotaDTO {
|
||||
|
||||
private List<String> types;
|
||||
|
||||
private List<String> names;
|
||||
}
|
||||
@@ -1,32 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.dto;
|
||||
|
||||
import com.xuxd.kafka.console.beans.dos.SysPermissionDO;
|
||||
import lombok.Data;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:17
|
||||
**/
|
||||
@Data
|
||||
public class SysPermissionDTO {
|
||||
|
||||
private String name;
|
||||
|
||||
/**
|
||||
* 权限类型: 0:菜单,1:按钮
|
||||
*/
|
||||
private Integer type;
|
||||
|
||||
private Long parentId;
|
||||
|
||||
private String permission;
|
||||
|
||||
public SysPermissionDO toSysPermissionDO() {
|
||||
SysPermissionDO permissionDO = new SysPermissionDO();
|
||||
permissionDO.setName(this.name);
|
||||
permissionDO.setType(this.type);
|
||||
permissionDO.setParentId(this.parentId);
|
||||
permissionDO.setPermission(this.permission);
|
||||
return permissionDO;
|
||||
}
|
||||
}
|
||||
@@ -1,35 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.dto;
|
||||
|
||||
import com.xuxd.kafka.console.beans.dos.SysRoleDO;
|
||||
import lombok.Data;
|
||||
import org.apache.commons.collections.CollectionUtils;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:17
|
||||
**/
|
||||
@Data
|
||||
public class SysRoleDTO {
|
||||
|
||||
private Long id;
|
||||
|
||||
private String roleName;
|
||||
|
||||
private String description;
|
||||
|
||||
private List<String> permissionIds;
|
||||
|
||||
public SysRoleDO toDO() {
|
||||
SysRoleDO roleDO = new SysRoleDO();
|
||||
roleDO.setId(this.id);
|
||||
roleDO.setRoleName(this.roleName);
|
||||
roleDO.setDescription(this.description);
|
||||
if (CollectionUtils.isNotEmpty(permissionIds)) {
|
||||
roleDO.setPermissionIds(StringUtils.join(this.permissionIds, ","));
|
||||
}
|
||||
return roleDO;
|
||||
}
|
||||
}
|
||||
@@ -1,34 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.dto;
|
||||
|
||||
import com.xuxd.kafka.console.beans.dos.SysUserDO;
|
||||
import lombok.Data;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:17
|
||||
**/
|
||||
@Data
|
||||
public class SysUserDTO {
|
||||
|
||||
private Long id;
|
||||
|
||||
private String username;
|
||||
|
||||
private String password;
|
||||
|
||||
private String salt;
|
||||
|
||||
private String roleIds;
|
||||
|
||||
private Boolean resetPassword = false;
|
||||
|
||||
public SysUserDO toDO() {
|
||||
SysUserDO userDO = new SysUserDO();
|
||||
userDO.setId(this.id);
|
||||
userDO.setUsername(this.username);
|
||||
userDO.setPassword(this.password);
|
||||
userDO.setSalt(this.salt);
|
||||
userDO.setRoleIds(this.roleIds);
|
||||
return userDO;
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,11 @@
|
||||
package com.xuxd.kafka.console.beans.dto.user;
|
||||
|
||||
import com.xuxd.kafka.console.beans.enums.Role;
|
||||
import lombok.Data;
|
||||
|
||||
@Data
|
||||
public class AddUserDTO {
|
||||
private String username;
|
||||
private String password;
|
||||
private Role role;
|
||||
}
|
||||
@@ -0,0 +1,9 @@
|
||||
package com.xuxd.kafka.console.beans.dto.user;
|
||||
|
||||
import lombok.Data;
|
||||
|
||||
@Data
|
||||
public class ListUserDTO {
|
||||
private Long id;
|
||||
private String username;
|
||||
}
|
||||
@@ -0,0 +1,9 @@
|
||||
package com.xuxd.kafka.console.beans.dto.user;
|
||||
|
||||
import lombok.Data;
|
||||
|
||||
@Data
|
||||
public class LoginDTO {
|
||||
private String username;
|
||||
private String password;
|
||||
}
|
||||
@@ -0,0 +1,9 @@
|
||||
package com.xuxd.kafka.console.beans.dto.user;
|
||||
|
||||
import lombok.Data;
|
||||
|
||||
@Data
|
||||
public class PasswordDTO {
|
||||
private Long userId;
|
||||
private String password;
|
||||
}
|
||||
@@ -0,0 +1,11 @@
|
||||
package com.xuxd.kafka.console.beans.dto.user;
|
||||
|
||||
import com.xuxd.kafka.console.beans.enums.Role;
|
||||
import lombok.Data;
|
||||
|
||||
@Data
|
||||
public class UpdateUserDTO {
|
||||
private String username;
|
||||
private String password;
|
||||
private Role role;
|
||||
}
|
||||
@@ -0,0 +1,6 @@
|
||||
package com.xuxd.kafka.console.beans.enums;
|
||||
|
||||
public enum Role {
|
||||
developer,
|
||||
manager
|
||||
}
|
||||
@@ -1,82 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.vo;
|
||||
|
||||
import lombok.Data;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.apache.kafka.common.config.internals.QuotaConfigs;
|
||||
import org.apache.kafka.common.quota.ClientQuotaEntity;
|
||||
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
/**
|
||||
* @author 晓东哥哥
|
||||
*/
|
||||
@Data
|
||||
public class ClientQuotaEntityVO {
|
||||
|
||||
private String user;
|
||||
|
||||
private String client;
|
||||
|
||||
private String ip;
|
||||
|
||||
private String consumerRate;
|
||||
|
||||
private String producerRate;
|
||||
|
||||
private String requestPercentage;
|
||||
|
||||
public static ClientQuotaEntityVO from(ClientQuotaEntity entity, List<String> entityTypes, Map<String, Object> config) {
|
||||
ClientQuotaEntityVO entityVO = new ClientQuotaEntityVO();
|
||||
Map<String, String> entries = entity.entries();
|
||||
entityTypes.forEach(type -> {
|
||||
switch (type) {
|
||||
case ClientQuotaEntity.USER:
|
||||
entityVO.setUser(entries.get(type));
|
||||
break;
|
||||
case ClientQuotaEntity.CLIENT_ID:
|
||||
entityVO.setClient(entries.get(type));
|
||||
break;
|
||||
case ClientQuotaEntity.IP:
|
||||
entityVO.setIp(entries.get(type));
|
||||
break;
|
||||
default:
|
||||
break;
|
||||
}
|
||||
});
|
||||
entityVO.setConsumerRate(convert(config.getOrDefault(QuotaConfigs.CONSUMER_BYTE_RATE_OVERRIDE_CONFIG, "")));
|
||||
entityVO.setProducerRate(convert(config.getOrDefault(QuotaConfigs.PRODUCER_BYTE_RATE_OVERRIDE_CONFIG, "")));
|
||||
entityVO.setRequestPercentage(config.getOrDefault(QuotaConfigs.REQUEST_PERCENTAGE_OVERRIDE_CONFIG, "").toString());
|
||||
return entityVO;
|
||||
}
|
||||
|
||||
|
||||
public static String convert(Object num) {
|
||||
if (num == null) {
|
||||
return null;
|
||||
}
|
||||
|
||||
if (num instanceof String) {
|
||||
if ((StringUtils.isBlank((String) num))) {
|
||||
return (String) num;
|
||||
}
|
||||
}
|
||||
|
||||
if (num instanceof Number) {
|
||||
Number number = (Number) num;
|
||||
double value = number.doubleValue();
|
||||
double _1kb = 1024;
|
||||
double _1mb = 1024 * _1kb;
|
||||
if (value < _1kb) {
|
||||
return value + " Byte";
|
||||
}
|
||||
if (value < _1mb) {
|
||||
return String.format("%.1f KB", (value / _1kb));
|
||||
}
|
||||
if (value >= _1mb) {
|
||||
return String.format("%.1f MB", (value / _1mb));
|
||||
}
|
||||
}
|
||||
return String.valueOf(num);
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,16 @@
|
||||
package com.xuxd.kafka.console.beans.vo;
|
||||
|
||||
import com.fasterxml.jackson.annotation.JsonFormat;
|
||||
import com.xuxd.kafka.console.beans.enums.Role;
|
||||
import lombok.Data;
|
||||
|
||||
import java.util.Date;
|
||||
|
||||
@Data
|
||||
public class DevOpsUserVO {
|
||||
private Long id;
|
||||
private String username;
|
||||
private Role role;
|
||||
@JsonFormat(pattern="yyyy-MM-dd HH:mm:ss",timezone="GMT+8")
|
||||
private Date createTime;
|
||||
}
|
||||
16
src/main/java/com/xuxd/kafka/console/beans/vo/LoginVO.java
Normal file
16
src/main/java/com/xuxd/kafka/console/beans/vo/LoginVO.java
Normal file
@@ -0,0 +1,16 @@
|
||||
package com.xuxd.kafka.console.beans.vo;
|
||||
|
||||
import com.xuxd.kafka.console.beans.enums.Role;
|
||||
import lombok.AllArgsConstructor;
|
||||
import lombok.Builder;
|
||||
import lombok.Data;
|
||||
import lombok.NoArgsConstructor;
|
||||
|
||||
@Data
|
||||
@Builder
|
||||
@AllArgsConstructor
|
||||
@NoArgsConstructor
|
||||
public class LoginVO {
|
||||
private String token;
|
||||
private Role role;
|
||||
}
|
||||
@@ -1,43 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.vo;
|
||||
|
||||
import com.xuxd.kafka.console.beans.dos.SysPermissionDO;
|
||||
import lombok.Data;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/17 21:18
|
||||
**/
|
||||
@Data
|
||||
public class SysPermissionVO {
|
||||
|
||||
private Long id;
|
||||
|
||||
private String name;
|
||||
|
||||
/**
|
||||
* 权限类型: 0:菜单,1:按钮
|
||||
*/
|
||||
private Integer type;
|
||||
|
||||
private Long parentId;
|
||||
|
||||
private String permission;
|
||||
|
||||
private Long key;
|
||||
|
||||
private List<SysPermissionVO> children;
|
||||
|
||||
public static SysPermissionVO from(SysPermissionDO permissionDO) {
|
||||
SysPermissionVO permissionVO = new SysPermissionVO();
|
||||
|
||||
permissionVO.setPermission(permissionDO.getPermission());
|
||||
permissionVO.setType(permissionDO.getType());
|
||||
permissionVO.setName(permissionDO.getName());
|
||||
permissionVO.setParentId(permissionDO.getParentId());
|
||||
permissionVO.setKey(permissionDO.getId());
|
||||
permissionVO.setId(permissionDO.getId());
|
||||
return permissionVO;
|
||||
}
|
||||
}
|
||||
@@ -1,38 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.vo;
|
||||
|
||||
import com.xuxd.kafka.console.beans.dos.SysRoleDO;
|
||||
import lombok.Data;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/19 21:12
|
||||
**/
|
||||
@Data
|
||||
public class SysRoleVO {
|
||||
|
||||
private Long id;
|
||||
|
||||
private String roleName;
|
||||
|
||||
private String description;
|
||||
|
||||
private List<Long> permissionIds;
|
||||
|
||||
public static SysRoleVO from(SysRoleDO roleDO) {
|
||||
SysRoleVO roleVO = new SysRoleVO();
|
||||
roleVO.setId(roleDO.getId());
|
||||
roleVO.setRoleName(roleDO.getRoleName());
|
||||
roleVO.setDescription(roleDO.getDescription());
|
||||
if (StringUtils.isNotEmpty(roleDO.getPermissionIds())) {
|
||||
List<Long> list = Arrays.stream(roleDO.getPermissionIds().split(",")).
|
||||
filter(StringUtils::isNotEmpty).map(e -> Long.valueOf(e.trim())).collect(Collectors.toList());
|
||||
roleVO.setPermissionIds(list);
|
||||
}
|
||||
return roleVO;
|
||||
}
|
||||
}
|
||||
@@ -1,31 +0,0 @@
|
||||
package com.xuxd.kafka.console.beans.vo;
|
||||
|
||||
import com.xuxd.kafka.console.beans.dos.SysUserDO;
|
||||
import lombok.Data;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/6 13:06
|
||||
**/
|
||||
@Data
|
||||
public class SysUserVO {
|
||||
|
||||
private Long id;
|
||||
|
||||
private String username;
|
||||
|
||||
private String password;
|
||||
|
||||
private String roleIds;
|
||||
|
||||
private String roleNames;
|
||||
|
||||
public static SysUserVO from(SysUserDO userDO) {
|
||||
SysUserVO userVO = new SysUserVO();
|
||||
userVO.setId(userDO.getId());
|
||||
userVO.setUsername(userDO.getUsername());
|
||||
userVO.setRoleIds(userDO.getRoleIds());
|
||||
userVO.setPassword(userDO.getPassword());
|
||||
return userVO;
|
||||
}
|
||||
}
|
||||
@@ -2,15 +2,20 @@ package com.xuxd.kafka.console.boot;
|
||||
|
||||
import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
|
||||
import com.xuxd.kafka.console.beans.dos.ClusterInfoDO;
|
||||
import com.xuxd.kafka.console.beans.dos.DevOpsUserDO;
|
||||
import com.xuxd.kafka.console.config.KafkaConfig;
|
||||
import com.xuxd.kafka.console.dao.ClusterInfoMapper;
|
||||
import com.xuxd.kafka.console.dao.DevOpsUserMapper;
|
||||
import com.xuxd.kafka.console.utils.ConvertUtil;
|
||||
import java.util.List;
|
||||
|
||||
import com.xuxd.kafka.console.utils.Md5Utils;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.collections.CollectionUtils;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.springframework.beans.factory.ObjectProvider;
|
||||
import org.springframework.beans.factory.SmartInitializingSingleton;
|
||||
import org.springframework.beans.factory.annotation.Value;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
/**
|
||||
|
||||
@@ -0,0 +1,42 @@
package com.xuxd.kafka.console.boot;

import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.xuxd.kafka.console.beans.dos.DevOpsUserDO;
import com.xuxd.kafka.console.beans.enums.Role;
import com.xuxd.kafka.console.dao.DevOpsUserMapper;
import com.xuxd.kafka.console.utils.Md5Utils;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.SmartInitializingSingleton;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
@Slf4j
@RequiredArgsConstructor
public class InitSuperDevOpsUser implements SmartInitializingSingleton {

    private final DevOpsUserMapper devOpsUserMapper;
    public final static String SUPER_USERNAME = "admin";

    @Value("${devops.password:kafka-console-ui521}")
    private String password;

    @Override
    public void afterSingletonsInstantiated() {
        QueryWrapper<DevOpsUserDO> userDOQueryWrapper = new QueryWrapper<>();
        userDOQueryWrapper.eq("username", SUPER_USERNAME);
        DevOpsUserDO userDO = devOpsUserMapper.selectOne(userDOQueryWrapper);
        if (userDO == null) {
            DevOpsUserDO devOpsUserDO = new DevOpsUserDO();
            devOpsUserDO.setUsername(SUPER_USERNAME);
            devOpsUserDO.setPassword(Md5Utils.MD5(password));
            devOpsUserDO.setRole(Role.manager);
            devOpsUserMapper.insert(devOpsUserDO);
        } else {
            userDO.setPassword(Md5Utils.MD5(password));
            devOpsUserMapper.updateById(userDO);
        }
        log.info("init super devops user done, username = {}", SUPER_USERNAME);
    }
}
@@ -1,91 +0,0 @@
|
||||
package com.xuxd.kafka.console.cache;
|
||||
|
||||
import com.xuxd.kafka.console.beans.RolePermUpdateEvent;
|
||||
import com.xuxd.kafka.console.beans.dos.SysPermissionDO;
|
||||
import com.xuxd.kafka.console.beans.dos.SysRoleDO;
|
||||
import com.xuxd.kafka.console.dao.SysPermissionMapper;
|
||||
import com.xuxd.kafka.console.dao.SysRoleMapper;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.springframework.beans.factory.SmartInitializingSingleton;
|
||||
import org.springframework.context.ApplicationListener;
|
||||
import org.springframework.context.annotation.DependsOn;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
import java.util.*;
|
||||
import java.util.function.Function;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/18 15:47
|
||||
**/
|
||||
@DependsOn("dataInit")
|
||||
@Slf4j
|
||||
@Component
|
||||
public class RolePermCache implements ApplicationListener<RolePermUpdateEvent>, SmartInitializingSingleton {
|
||||
|
||||
private Map<Long, SysPermissionDO> permCache = new HashMap<>();
|
||||
|
||||
private Map<Long, Set<String>> rolePermCache = new HashMap<>();
|
||||
|
||||
private final SysPermissionMapper permissionMapper;
|
||||
|
||||
private final SysRoleMapper roleMapper;
|
||||
|
||||
public RolePermCache(SysPermissionMapper permissionMapper, SysRoleMapper roleMapper) {
|
||||
this.permissionMapper = permissionMapper;
|
||||
this.roleMapper = roleMapper;
|
||||
}
|
||||
|
||||
@Override
|
||||
public void onApplicationEvent(RolePermUpdateEvent event) {
|
||||
log.info("更新角色权限信息:{}", event);
|
||||
if (event.isReload()) {
|
||||
this.loadPermCache();
|
||||
}
|
||||
refresh();
|
||||
}
|
||||
|
||||
public Map<Long, SysPermissionDO> getPermCache() {
|
||||
return permCache;
|
||||
}
|
||||
|
||||
public Map<Long, Set<String>> getRolePermCache() {
|
||||
return rolePermCache;
|
||||
}
|
||||
|
||||
private void refresh() {
|
||||
List<SysRoleDO> roleDOS = roleMapper.selectList(null);
|
||||
Map<Long, Set<String>> tmp = new HashMap<>();
|
||||
for (SysRoleDO roleDO : roleDOS) {
|
||||
String permissionIds = roleDO.getPermissionIds();
|
||||
if (StringUtils.isEmpty(permissionIds)) {
|
||||
continue;
|
||||
}
|
||||
List<Long> list = Arrays.stream(permissionIds.split(",")).map(String::trim).filter(StringUtils::isNotEmpty).map(Long::valueOf).collect(Collectors.toList());
|
||||
Set<String> permSet = tmp.getOrDefault(roleDO.getId(), new HashSet<>());
|
||||
for (Long permId : list) {
|
||||
SysPermissionDO permissionDO = permCache.get(permId);
|
||||
if (permissionDO != null) {
|
||||
permSet.add(permissionDO.getPermission());
|
||||
}
|
||||
}
|
||||
tmp.put(roleDO.getId(), permSet);
|
||||
}
|
||||
rolePermCache = tmp;
|
||||
}
|
||||
|
||||
private void loadPermCache() {
|
||||
List<SysPermissionDO> roleDOS = permissionMapper.selectList(null);
|
||||
Map<Long, SysPermissionDO> map = roleDOS.stream().collect(Collectors.toMap(SysPermissionDO::getId, Function.identity(), (e1, e2) -> e1));
|
||||
permCache = map;
|
||||
}
|
||||
|
||||
|
||||
@Override
|
||||
public void afterSingletonsInstantiated() {
|
||||
this.loadPermCache();
|
||||
this.refresh();
|
||||
}
|
||||
}
|
||||
@@ -1,33 +0,0 @@
|
||||
package com.xuxd.kafka.console.cache;
|
||||
|
||||
import com.google.common.cache.CacheBuilder;
|
||||
import com.google.common.cache.CacheLoader;
|
||||
import com.google.common.cache.LoadingCache;
|
||||
import com.google.common.cache.RemovalListener;
|
||||
import kafka.console.KafkaConsole;
|
||||
|
||||
import java.util.concurrent.ExecutionException;
|
||||
import java.util.concurrent.TimeUnit;
|
||||
|
||||
public class TimeBasedCache<K, V> {
|
||||
private LoadingCache<K, V> cache;
|
||||
|
||||
private KafkaConsole console;
|
||||
|
||||
public TimeBasedCache(CacheLoader<K, V> loader, RemovalListener<K, V> listener) {
|
||||
cache = CacheBuilder.newBuilder()
|
||||
.maximumSize(50) // maximum 100 records can be cached
|
||||
.expireAfterAccess(30, TimeUnit.MINUTES) // cache will expire after 30 minutes of access
|
||||
.removalListener(listener)
|
||||
.build(loader);
|
||||
|
||||
}
|
||||
|
||||
public V get(K k) {
|
||||
try {
|
||||
return cache.get(k);
|
||||
} catch (ExecutionException e) {
|
||||
throw new RuntimeException("Get connection from cache error.", e);
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,21 +0,0 @@
package com.xuxd.kafka.console.config;

import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Configuration;

/**
 * @author: xuxd
 * @date: 2023/5/9 21:08
 **/
@Data
@Configuration
@ConfigurationProperties(prefix = "auth")
public class AuthConfig {

    private boolean enable;

    private String secret = "kafka-console-ui-default-secret";

    private long expireHours;
}
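AuthConfig's secret and expireHours pair naturally with the jjwt 0.9 dependency added in pom.xml earlier in this diff. A hedged sketch of issuing and checking such a token with the jjwt 0.9 API; this helper is not part of the diff and the expiry value is assumed:

```java
import io.jsonwebtoken.Claims;
import io.jsonwebtoken.Jwts;
import io.jsonwebtoken.SignatureAlgorithm;

import java.nio.charset.StandardCharsets;
import java.util.Date;

// Illustrative only: shows the jjwt 0.9 calls a login service could use with
// AuthConfig's secret and expireHours. This class does not exist in the diff.
public class JwtSketch {

    private static final String SECRET = "kafka-console-ui-default-secret"; // AuthConfig default
    private static final long EXPIRE_HOURS = 24;                            // assumed value

    public static String issue(String username) {
        Date expiry = new Date(System.currentTimeMillis() + EXPIRE_HOURS * 3600_000L);
        return Jwts.builder()
                .setSubject(username)
                .setExpiration(expiry)
                .signWith(SignatureAlgorithm.HS256, SECRET.getBytes(StandardCharsets.UTF_8))
                .compact();
    }

    public static String verifyAndGetUser(String token) {
        Claims claims = Jwts.parser()
                .setSigningKey(SECRET.getBytes(StandardCharsets.UTF_8))
                .parseClaimsJws(token)
                .getBody();
        return claims.getSubject(); // throws if the signature is invalid or the token has expired
    }
}
```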
@@ -20,12 +20,6 @@ public class KafkaConfig {
|
||||
|
||||
private Properties properties;
|
||||
|
||||
private boolean cacheAdminConnection;
|
||||
|
||||
private boolean cacheProducerConnection;
|
||||
|
||||
private boolean cacheConsumerConnection;
|
||||
|
||||
public String getBootstrapServer() {
|
||||
return bootstrapServer;
|
||||
}
|
||||
@@ -49,28 +43,4 @@ public class KafkaConfig {
|
||||
public void setProperties(Properties properties) {
|
||||
this.properties = properties;
|
||||
}
|
||||
|
||||
public boolean isCacheAdminConnection() {
|
||||
return cacheAdminConnection;
|
||||
}
|
||||
|
||||
public void setCacheAdminConnection(boolean cacheAdminConnection) {
|
||||
this.cacheAdminConnection = cacheAdminConnection;
|
||||
}
|
||||
|
||||
public boolean isCacheProducerConnection() {
|
||||
return cacheProducerConnection;
|
||||
}
|
||||
|
||||
public void setCacheProducerConnection(boolean cacheProducerConnection) {
|
||||
this.cacheProducerConnection = cacheProducerConnection;
|
||||
}
|
||||
|
||||
public boolean isCacheConsumerConnection() {
|
||||
return cacheConsumerConnection;
|
||||
}
|
||||
|
||||
public void setCacheConsumerConnection(boolean cacheConsumerConnection) {
|
||||
this.cacheConsumerConnection = cacheConsumerConnection;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,6 +1,13 @@
|
||||
package com.xuxd.kafka.console.config;
|
||||
|
||||
import kafka.console.*;
|
||||
import kafka.console.ClusterConsole;
|
||||
import kafka.console.ConfigConsole;
|
||||
import kafka.console.ConsumerConsole;
|
||||
import kafka.console.KafkaAclConsole;
|
||||
import kafka.console.KafkaConfigConsole;
|
||||
import kafka.console.MessageConsole;
|
||||
import kafka.console.OperationConsole;
|
||||
import kafka.console.TopicConsole;
|
||||
import org.springframework.context.annotation.Bean;
|
||||
import org.springframework.context.annotation.Configuration;
|
||||
|
||||
@@ -45,7 +52,7 @@ public class KafkaConfiguration {
|
||||
|
||||
@Bean
|
||||
public OperationConsole operationConsole(KafkaConfig config, TopicConsole topicConsole,
|
||||
ConsumerConsole consumerConsole) {
|
||||
ConsumerConsole consumerConsole) {
|
||||
return new OperationConsole(config, topicConsole, consumerConsole);
|
||||
}
|
||||
|
||||
@@ -53,9 +60,4 @@ public class KafkaConfiguration {
|
||||
public MessageConsole messageConsole(KafkaConfig config) {
|
||||
return new MessageConsole(config);
|
||||
}
|
||||
|
||||
@Bean
|
||||
public ClientQuotaConsole clientQuotaConsole(KafkaConfig config) {
|
||||
return new ClientQuotaConsole(config);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,6 +1,5 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.AclEntry;
|
||||
import com.xuxd.kafka.console.beans.dto.AddAuthDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.ConsumerAuthDTO;
|
||||
@@ -29,7 +28,6 @@ public class AclAuthController {
|
||||
@Autowired
|
||||
private AclService aclService;
|
||||
|
||||
@Permission({"acl:authority:detail", "acl:sasl-scram:detail"})
|
||||
@PostMapping("/detail")
|
||||
public Object getAclDetailList(@RequestBody QueryAclDTO param) {
|
||||
return aclService.getAclDetailList(param.toEntry());
|
||||
@@ -40,13 +38,11 @@ public class AclAuthController {
|
||||
return aclService.getOperationList();
|
||||
}
|
||||
|
||||
@Permission("acl:authority")
|
||||
@PostMapping("/list")
|
||||
public Object getAclList(@RequestBody QueryAclDTO param) {
|
||||
return aclService.getAclList(param.toEntry());
|
||||
}
|
||||
|
||||
@Permission({"acl:authority:add-principal", "acl:authority:add", "acl:sasl-scram:add-auth"})
|
||||
@PostMapping
|
||||
public Object addAcl(@RequestBody AddAuthDTO param) {
|
||||
return aclService.addAcl(param.toAclEntry());
|
||||
@@ -58,7 +54,6 @@ public class AclAuthController {
|
||||
* @param param entry.topic && entry.username must.
|
||||
* @return
|
||||
*/
|
||||
@Permission({"acl:authority:producer", "acl:sasl-scram:producer"})
|
||||
@PostMapping("/producer")
|
||||
public Object addProducerAcl(@RequestBody ProducerAuthDTO param) {
|
||||
|
||||
@@ -71,7 +66,6 @@ public class AclAuthController {
|
||||
* @param param entry.topic && entry.groupId entry.username must.
|
||||
* @return
|
||||
*/
|
||||
@Permission({"acl:authority:consumer", "acl:sasl-scram:consumer"})
|
||||
@PostMapping("/consumer")
|
||||
public Object addConsumerAcl(@RequestBody ConsumerAuthDTO param) {
|
||||
|
||||
@@ -84,7 +78,6 @@ public class AclAuthController {
|
||||
* @param entry entry
|
||||
* @return
|
||||
*/
|
||||
@Permission({"acl:authority:clean", "acl:sasl-scram:pure"})
|
||||
@DeleteMapping
|
||||
public Object deleteAclByUser(@RequestBody AclEntry entry) {
|
||||
return aclService.deleteAcl(entry);
|
||||
@@ -96,7 +89,6 @@ public class AclAuthController {
|
||||
* @param param entry.username
|
||||
* @return
|
||||
*/
|
||||
@Permission({"acl:authority:clean", "acl:sasl-scram:pure"})
|
||||
@DeleteMapping("/user")
|
||||
public Object deleteAclByUser(@RequestBody DeleteAclDTO param) {
|
||||
return aclService.deleteUserAcl(param.toUserEntry());
|
||||
@@ -108,7 +100,6 @@ public class AclAuthController {
|
||||
* @param param entry.topic && entry.username must.
|
||||
* @return
|
||||
*/
|
||||
@Permission({"acl:authority:clean", "acl:sasl-scram:pure"})
|
||||
@DeleteMapping("/producer")
|
||||
public Object deleteProducerAcl(@RequestBody ProducerAuthDTO param) {
|
||||
|
||||
@@ -121,22 +112,10 @@ public class AclAuthController {
|
||||
* @param param entry.topic && entry.groupId entry.username must.
|
||||
* @return
|
||||
*/
|
||||
@Permission({"acl:authority:clean", "acl:sasl-scram:pure"})
|
||||
@DeleteMapping("/consumer")
|
||||
public Object deleteConsumerAcl(@RequestBody ConsumerAuthDTO param) {
|
||||
|
||||
return aclService.deleteConsumerAcl(param.toTopicEntry(), param.toGroupEntry());
|
||||
}
|
||||
|
||||
/**
|
||||
* clear principal acls.
|
||||
*
|
||||
* @param param acl principal.
|
||||
* @return true or false.
|
||||
*/
|
||||
@Permission({"acl:authority:clean", "acl:sasl-scram:pure"})
|
||||
@DeleteMapping("/clear")
|
||||
public Object clearAcl(@RequestBody DeleteAclDTO param) {
|
||||
return aclService.clearAcl(param.toUserEntry());
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,10 +1,8 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.AclEntry;
|
||||
import com.xuxd.kafka.console.beans.AclUser;
|
||||
import com.xuxd.kafka.console.beans.annotation.RequiredAuthorize;
|
||||
import com.xuxd.kafka.console.service.AclService;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.web.bind.annotation.DeleteMapping;
|
||||
import org.springframework.web.bind.annotation.GetMapping;
|
||||
@@ -15,7 +13,7 @@ import org.springframework.web.bind.annotation.RequestParam;
|
||||
import org.springframework.web.bind.annotation.RestController;
|
||||
|
||||
/**
|
||||
* kafka-console-ui. sasl scram user.
|
||||
* kafka-console-ui.
|
||||
*
|
||||
* @author xuxd
|
||||
* @date 2021-08-28 21:13:05
|
||||
@@ -27,41 +25,32 @@ public class AclUserController {
|
||||
@Autowired
|
||||
private AclService aclService;
|
||||
|
||||
@Permission("acl:sasl-scram")
|
||||
@GetMapping
|
||||
public Object getUserList() {
|
||||
return aclService.getUserList();
|
||||
}
|
||||
|
||||
@Permission({"acl:sasl-scram:add-update", "acl:sasl-scram:add-auth"})
|
||||
@PostMapping
|
||||
@RequiredAuthorize
|
||||
public Object addOrUpdateUser(@RequestBody AclUser user) {
|
||||
return aclService.addOrUpdateUser(user.getUsername(), user.getPassword());
|
||||
}
|
||||
|
||||
@Permission({"acl:sasl-scram:del", "acl:sasl-scram:pure"})
|
||||
@DeleteMapping
|
||||
@RequiredAuthorize
|
||||
public Object deleteUser(@RequestBody AclUser user) {
|
||||
return aclService.deleteUser(user.getUsername());
|
||||
}
|
||||
|
||||
|
||||
@Permission({"acl:sasl-scram:del", "acl:sasl-scram:pure"})
|
||||
@DeleteMapping("/auth")
|
||||
@RequiredAuthorize
|
||||
public Object deleteUserAndAuth(@RequestBody AclUser user) {
|
||||
return aclService.deleteUserAndAuth(user.getUsername());
|
||||
}
|
||||
|
||||
@Permission("acl:sasl-scram:detail")
|
||||
@GetMapping("/detail")
|
||||
public Object getUserDetail(@RequestParam String username) {
|
||||
return aclService.getUserDetail(username);
|
||||
}
|
||||
|
||||
@GetMapping("/scram")
|
||||
public Object getSaslScramUserList(@RequestParam(required = false) String username) {
|
||||
AclEntry entry = new AclEntry();
|
||||
entry.setPrincipal(StringUtils.isNotBlank(username) ? username : null);
|
||||
return aclService.getSaslScramUserList(entry);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,36 +0,0 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.dto.LoginUserDTO;
|
||||
import com.xuxd.kafka.console.config.AuthConfig;
|
||||
import com.xuxd.kafka.console.service.AuthService;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/11 18:54
|
||||
**/
|
||||
@RestController
|
||||
@RequestMapping("/auth")
|
||||
public class AuthController {
|
||||
|
||||
|
||||
private final AuthConfig authConfig;
|
||||
|
||||
private final AuthService authService;
|
||||
|
||||
public AuthController(AuthConfig authConfig, AuthService authService) {
|
||||
this.authConfig = authConfig;
|
||||
this.authService = authService;
|
||||
}
|
||||
|
||||
@GetMapping("/enable")
|
||||
public boolean enable() {
|
||||
return authConfig.isEnable();
|
||||
}
|
||||
|
||||
@PostMapping("/login")
|
||||
public ResponseData login(@RequestBody LoginUserDTO userDTO) {
|
||||
return authService.login(userDTO);
|
||||
}
|
||||
}
|
||||
@@ -1,56 +0,0 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.dto.AlterClientQuotaDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.QueryClientQuotaDTO;
|
||||
import com.xuxd.kafka.console.service.ClientQuotaService;
|
||||
import org.apache.commons.collections.CollectionUtils;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/1/9 21:50
|
||||
**/
|
||||
@RestController
|
||||
@RequestMapping("/client/quota")
|
||||
public class ClientQuotaController {
|
||||
|
||||
private final ClientQuotaService clientQuotaService;
|
||||
|
||||
public ClientQuotaController(ClientQuotaService clientQuotaService) {
|
||||
this.clientQuotaService = clientQuotaService;
|
||||
}
|
||||
|
||||
@Permission({"quota:user", "quota:client", "quota:user-client"})
|
||||
@PostMapping("/list")
|
||||
public Object getClientQuotaConfigs(@RequestBody QueryClientQuotaDTO request) {
|
||||
return clientQuotaService.getClientQuotaConfigs(request.getTypes(), request.getNames());
|
||||
}
|
||||
|
||||
@Permission({"quota:user:add", "quota:client:add", "quota:user-client:add", "quota:edit"})
|
||||
@PostMapping
|
||||
public Object alterClientQuotaConfigs(@RequestBody AlterClientQuotaDTO request) {
|
||||
if (request.getTypes().size() != 2) {
|
||||
if (CollectionUtils.isEmpty(request.getTypes())
|
||||
|| CollectionUtils.isEmpty(request.getNames())
|
||||
|| request.getTypes().size() != request.getNames().size()) {
|
||||
return ResponseData.create().failed("types length and names length is invalid.");
|
||||
}
|
||||
}
|
||||
return clientQuotaService.alterClientQuotaConfigs(request);
|
||||
}
|
||||
|
||||
@Permission("quota:del")
|
||||
@DeleteMapping
|
||||
public Object deleteClientQuotaConfigs(@RequestBody AlterClientQuotaDTO request) {
|
||||
if (request.getTypes().size() != 2) {
|
||||
if (CollectionUtils.isEmpty(request.getTypes())
|
||||
|| CollectionUtils.isEmpty(request.getNames())
|
||||
|| request.getTypes().size() != request.getNames().size()) {
|
||||
return ResponseData.create().failed("types length and names length is invalid.");
|
||||
}
|
||||
}
|
||||
return clientQuotaService.deleteClientQuotaConfigs(request);
|
||||
}
|
||||
}
|
||||
@@ -1,16 +1,9 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.dto.ClusterInfoDTO;
|
||||
import com.xuxd.kafka.console.service.ClusterService;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.web.bind.annotation.DeleteMapping;
|
||||
import org.springframework.web.bind.annotation.GetMapping;
|
||||
import org.springframework.web.bind.annotation.PostMapping;
|
||||
import org.springframework.web.bind.annotation.PutMapping;
|
||||
import org.springframework.web.bind.annotation.RequestBody;
|
||||
import org.springframework.web.bind.annotation.RequestMapping;
|
||||
import org.springframework.web.bind.annotation.RestController;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
/**
|
||||
* kafka-console-ui.
|
||||
@@ -35,19 +28,16 @@ public class ClusterController {
|
||||
return clusterService.getClusterInfoList();
|
||||
}
|
||||
|
||||
@Permission("op:cluster-switch:add")
|
||||
@PostMapping("/info")
|
||||
public Object addClusterInfo(@RequestBody ClusterInfoDTO dto) {
|
||||
return clusterService.addClusterInfo(dto.to());
|
||||
}
|
||||
|
||||
@Permission("op:cluster-switch:del")
|
||||
@DeleteMapping("/info")
|
||||
public Object deleteClusterInfo(@RequestBody ClusterInfoDTO dto) {
|
||||
return clusterService.deleteClusterInfo(dto.getId());
|
||||
}
|
||||
|
||||
@Permission("op:cluster-switch:edit")
|
||||
@PutMapping("/info")
|
||||
public Object updateClusterInfo(@RequestBody ClusterInfoDTO dto) {
|
||||
return clusterService.updateClusterInfo(dto.to());
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.annotation.RequiredAuthorize;
|
||||
import com.xuxd.kafka.console.beans.dto.AlterConfigDTO;
|
||||
import com.xuxd.kafka.console.beans.enums.AlterType;
|
||||
import com.xuxd.kafka.console.config.KafkaConfig;
|
||||
@@ -42,56 +42,53 @@ public class ConfigController {
|
||||
return ResponseData.create().data(configMap).success();
|
||||
}
|
||||
|
||||
@Permission("topic:property-config")
|
||||
@GetMapping("/topic")
|
||||
public Object getTopicConfig(String topic) {
|
||||
return configService.getTopicConfig(topic);
|
||||
}
|
||||
|
||||
@Permission("topic:property-config:edit")
|
||||
@PostMapping("/topic")
|
||||
@RequiredAuthorize
|
||||
public Object setTopicConfig(@RequestBody AlterConfigDTO dto) {
|
||||
return configService.alterTopicConfig(dto.getEntity(), dto.to(), AlterType.SET);
|
||||
}
|
||||
|
||||
@Permission("topic:property-config:del")
|
||||
@DeleteMapping("/topic")
|
||||
@RequiredAuthorize
|
||||
public Object deleteTopicConfig(@RequestBody AlterConfigDTO dto) {
|
||||
return configService.alterTopicConfig(dto.getEntity(), dto.to(), AlterType.DELETE);
|
||||
}
|
||||
|
||||
@Permission("cluster:property-config")
|
||||
@GetMapping("/broker")
|
||||
public Object getBrokerConfig(String brokerId) {
|
||||
return configService.getBrokerConfig(brokerId);
|
||||
}
|
||||
|
||||
@Permission("cluster:edit")
|
||||
@PostMapping("/broker")
|
||||
@RequiredAuthorize
|
||||
public Object setBrokerConfig(@RequestBody AlterConfigDTO dto) {
|
||||
return configService.alterBrokerConfig(dto.getEntity(), dto.to(), AlterType.SET);
|
||||
}
|
||||
|
||||
@Permission("cluster:edit")
|
||||
@DeleteMapping("/broker")
|
||||
@RequiredAuthorize
|
||||
public Object deleteBrokerConfig(@RequestBody AlterConfigDTO dto) {
|
||||
return configService.alterBrokerConfig(dto.getEntity(), dto.to(), AlterType.DELETE);
|
||||
}
|
||||
|
||||
@Permission("cluster:log-config")
|
||||
@GetMapping("/broker/logger")
|
||||
public Object getBrokerLoggerConfig(String brokerId) {
|
||||
return configService.getBrokerLoggerConfig(brokerId);
|
||||
}
|
||||
|
||||
@Permission("cluster:edit")
|
||||
@PostMapping("/broker/logger")
|
||||
@RequiredAuthorize
|
||||
public Object setBrokerLoggerConfig(@RequestBody AlterConfigDTO dto) {
|
||||
return configService.alterBrokerLoggerConfig(dto.getEntity(), dto.to(), AlterType.SET);
|
||||
}
|
||||
|
||||
@Permission("cluster:edit")
|
||||
@DeleteMapping("/broker/logger")
|
||||
@RequiredAuthorize
|
||||
public Object deleteBrokerLoggerConfig(@RequestBody AlterConfigDTO dto) {
|
||||
return configService.alterBrokerLoggerConfig(dto.getEntity(), dto.to(), AlterType.DELETE);
|
||||
}
|
||||
|
||||
@@ -1,20 +1,29 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.annotation.RequiredAuthorize;
|
||||
import com.xuxd.kafka.console.beans.dto.AddSubscriptionDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.QueryConsumerGroupDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.ResetOffsetDTO;
|
||||
import com.xuxd.kafka.console.service.ConsumerService;
|
||||
import java.util.Collections;
|
||||
import java.util.HashSet;
|
||||
import java.util.List;
|
||||
import java.util.Objects;
|
||||
import java.util.Set;
|
||||
import org.apache.commons.collections.CollectionUtils;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.apache.kafka.clients.consumer.OffsetResetStrategy;
|
||||
import org.apache.kafka.common.ConsumerGroupState;
|
||||
import org.apache.kafka.common.TopicPartition;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
import java.util.*;
|
||||
import org.springframework.web.bind.annotation.DeleteMapping;
|
||||
import org.springframework.web.bind.annotation.GetMapping;
|
||||
import org.springframework.web.bind.annotation.PostMapping;
|
||||
import org.springframework.web.bind.annotation.RequestBody;
|
||||
import org.springframework.web.bind.annotation.RequestMapping;
|
||||
import org.springframework.web.bind.annotation.RequestParam;
|
||||
import org.springframework.web.bind.annotation.RestController;
|
||||
|
||||
/**
|
||||
* kafka-console-ui.
|
||||
@@ -43,42 +52,36 @@ public class ConsumerController {
|
||||
return consumerService.getConsumerGroupList(groupIdList, stateSet);
|
||||
}
|
||||
|
||||
@Permission("group:del")
|
||||
@DeleteMapping("/group")
|
||||
public Object deleteConsumerGroup(@RequestParam String groupId) {
|
||||
return consumerService.deleteConsumerGroup(groupId);
|
||||
}
|
||||
|
||||
@Permission("group:client")
|
||||
@GetMapping("/member")
|
||||
public Object getConsumerMembers(@RequestParam String groupId) {
|
||||
return consumerService.getConsumerMembers(groupId);
|
||||
}
|
||||
|
||||
@Permission("group:consumer-detail")
|
||||
@GetMapping("/detail")
|
||||
public Object getConsumerDetail(@RequestParam String groupId) {
|
||||
return consumerService.getConsumerDetail(groupId);
|
||||
}
|
||||
|
||||
@Permission("group:add")
|
||||
@PostMapping("/subscription")
|
||||
@RequiredAuthorize
|
||||
public Object addSubscription(@RequestBody AddSubscriptionDTO subscriptionDTO) {
|
||||
return consumerService.addSubscription(subscriptionDTO.getGroupId(), subscriptionDTO.getTopic());
|
||||
}
|
||||
|
||||
@Permission({"group:consumer-detail:min",
|
||||
"group:consumer-detail:last",
|
||||
"group:consumer-detail:timestamp",
|
||||
"group:consumer-detail:any"})
|
||||
@PostMapping("/reset/offset")
|
||||
@RequiredAuthorize
|
||||
public Object restOffset(@RequestBody ResetOffsetDTO offsetDTO) {
|
||||
ResponseData res = ResponseData.create().failed("unknown");
|
||||
switch (offsetDTO.getLevel()) {
|
||||
case ResetOffsetDTO.Level.TOPIC:
|
||||
switch (offsetDTO.getType()) {
|
||||
case ResetOffsetDTO.Type
|
||||
.EARLIEST:
|
||||
.EARLIEST:
|
||||
res = consumerService.resetOffsetToEndpoint(offsetDTO.getGroupId(), offsetDTO.getTopic(), OffsetResetStrategy.EARLIEST);
|
||||
break;
|
||||
case ResetOffsetDTO.Type.LATEST:
|
||||
@@ -94,7 +97,7 @@ public class ConsumerController {
|
||||
case ResetOffsetDTO.Level.PARTITION:
|
||||
switch (offsetDTO.getType()) {
|
||||
case ResetOffsetDTO.Type
|
||||
.SPECIAL:
|
||||
.SPECIAL:
|
||||
res = consumerService.resetPartitionToTargetOffset(offsetDTO.getGroupId(), new TopicPartition(offsetDTO.getTopic(), offsetDTO.getPartition()), offsetDTO.getOffset());
|
||||
break;
|
||||
default:
|
||||
@@ -118,13 +121,11 @@ public class ConsumerController {
|
||||
return consumerService.getSubscribeTopicList(groupId);
|
||||
}
|
||||
|
||||
@Permission({"topic:consumer-detail"})
|
||||
@GetMapping("/topic/subscribed")
|
||||
public Object getTopicSubscribedByGroups(@RequestParam String topic) {
|
||||
return consumerService.getTopicSubscribedByGroups(topic);
|
||||
}
|
||||
|
||||
@Permission("group:offset-partition")
|
||||
@GetMapping("/offset/partition")
|
||||
public Object getOffsetPartition(@RequestParam String groupId) {
|
||||
return consumerService.getOffsetPartition(groupId);
|
||||
|
||||
@@ -0,0 +1,53 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.annotation.RequiredAuthorize;
|
||||
import com.xuxd.kafka.console.beans.dto.user.*;
|
||||
import com.xuxd.kafka.console.beans.vo.DevOpsUserVO;
|
||||
import com.xuxd.kafka.console.beans.vo.LoginVO;
|
||||
import com.xuxd.kafka.console.service.DevOpsUserService;
|
||||
import lombok.RequiredArgsConstructor;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* 用户管理
|
||||
* @author dongyinuo
|
||||
*/
|
||||
@RestController
|
||||
@RequiredArgsConstructor
|
||||
@RequestMapping("/devops/user")
|
||||
public class DevOpsUserController {
|
||||
|
||||
private final DevOpsUserService devOpsUserService;
|
||||
|
||||
@PostMapping("add")
|
||||
@RequiredAuthorize
|
||||
public ResponseData<Boolean> add(@RequestBody AddUserDTO addUserDTO){
|
||||
return devOpsUserService.add(addUserDTO);
|
||||
}
|
||||
|
||||
@PostMapping("update")
|
||||
@RequiredAuthorize
|
||||
public ResponseData<Boolean> update(@RequestBody UpdateUserDTO updateUserDTO){
|
||||
return devOpsUserService.update(updateUserDTO);
|
||||
}
|
||||
|
||||
@DeleteMapping
|
||||
@RequiredAuthorize
|
||||
public ResponseData<Boolean> delete(@RequestParam Long id){
|
||||
return devOpsUserService.delete(id);
|
||||
}
|
||||
|
||||
@GetMapping("list")
|
||||
@RequiredAuthorize
|
||||
public ResponseData<List<DevOpsUserVO>> list(@ModelAttribute ListUserDTO listUserDTO){
|
||||
return devOpsUserService.list(listUserDTO);
|
||||
}
|
||||
|
||||
@PostMapping("login")
|
||||
public ResponseData<LoginVO> login(@RequestBody LoginDTO loginDTO){
|
||||
return devOpsUserService.login(loginDTO.getUsername(), loginDTO.getPassword());
|
||||
}
|
||||
}
|
||||
@@ -1,16 +1,14 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.QueryMessage;
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.SendMessage;
|
||||
import com.xuxd.kafka.console.beans.dto.QueryMessageDTO;
|
||||
import com.xuxd.kafka.console.service.MessageService;
|
||||
import org.apache.commons.collections.CollectionUtils;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
import java.util.List;
|
||||
import org.springframework.web.bind.annotation.GetMapping;
|
||||
import org.springframework.web.bind.annotation.PostMapping;
|
||||
import org.springframework.web.bind.annotation.RequestBody;
|
||||
import org.springframework.web.bind.annotation.RequestMapping;
|
||||
import org.springframework.web.bind.annotation.RestController;
|
||||
|
||||
/**
|
||||
* kafka-console-ui.
|
||||
@@ -25,19 +23,16 @@ public class MessageController {
|
||||
@Autowired
|
||||
private MessageService messageService;
|
||||
|
||||
@Permission("message:search-time")
|
||||
@PostMapping("/search/time")
|
||||
public Object searchByTime(@RequestBody QueryMessageDTO dto) {
|
||||
return messageService.searchByTime(dto.toQueryMessage());
|
||||
}
|
||||
|
||||
@Permission("message:search-offset")
|
||||
@PostMapping("/search/offset")
|
||||
public Object searchByOffset(@RequestBody QueryMessageDTO dto) {
|
||||
return messageService.searchByOffset(dto.toQueryMessage());
|
||||
}
|
||||
|
||||
@Permission("message:detail")
|
||||
@PostMapping("/search/detail")
|
||||
public Object searchDetail(@RequestBody QueryMessageDTO dto) {
|
||||
return messageService.searchDetail(dto.toQueryMessage());
|
||||
@@ -48,24 +43,13 @@ public class MessageController {
|
||||
return messageService.deserializerList();
|
||||
}
|
||||
|
||||
@Permission("message:send")
|
||||
@PostMapping("/send")
|
||||
public Object send(@RequestBody SendMessage message) {
|
||||
return messageService.send(message);
|
||||
}
|
||||
|
||||
@Permission("message:resend")
|
||||
@PostMapping("/resend")
|
||||
public Object resend(@RequestBody SendMessage message) {
|
||||
return messageService.resend(message);
|
||||
}
|
||||
|
||||
@Permission("message:del")
|
||||
@DeleteMapping
|
||||
public Object delete(@RequestBody List<QueryMessage> messages) {
|
||||
if (CollectionUtils.isEmpty(messages)) {
|
||||
return ResponseData.create().failed("params is null");
|
||||
}
|
||||
return messageService.delete(messages);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.TopicPartition;
|
||||
import com.xuxd.kafka.console.beans.annotation.RequiredAuthorize;
|
||||
import com.xuxd.kafka.console.beans.dto.BrokerThrottleDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.ProposedAssignmentDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.ReplicationDTO;
|
||||
@@ -9,7 +9,13 @@ import com.xuxd.kafka.console.beans.dto.SyncDataDTO;
|
||||
import com.xuxd.kafka.console.service.OperationService;
|
||||
import org.apache.kafka.clients.admin.AdminClientConfig;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
import org.springframework.web.bind.annotation.DeleteMapping;
|
||||
import org.springframework.web.bind.annotation.GetMapping;
|
||||
import org.springframework.web.bind.annotation.PostMapping;
|
||||
import org.springframework.web.bind.annotation.RequestBody;
|
||||
import org.springframework.web.bind.annotation.RequestMapping;
|
||||
import org.springframework.web.bind.annotation.RequestParam;
|
||||
import org.springframework.web.bind.annotation.RestController;
|
||||
|
||||
/**
|
||||
* kafka-console-ui.
|
||||
@@ -25,12 +31,14 @@ public class OperationController {
|
||||
private OperationService operationService;
|
||||
|
||||
@PostMapping("/sync/consumer/offset")
|
||||
@RequiredAuthorize
|
||||
public Object syncConsumerOffset(@RequestBody SyncDataDTO dto) {
|
||||
dto.getProperties().put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, dto.getAddress());
|
||||
return operationService.syncConsumerOffset(dto.getGroupId(), dto.getTopic(), dto.getProperties());
|
||||
}
|
||||
|
||||
@PostMapping("/sync/min/offset/alignment")
|
||||
@RequiredAuthorize
|
||||
public Object minOffsetAlignment(@RequestBody SyncDataDTO dto) {
|
||||
dto.getProperties().put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, dto.getAddress());
|
||||
return operationService.minOffsetAlignment(dto.getGroupId(), dto.getTopic(), dto.getProperties());
|
||||
@@ -42,41 +50,39 @@ public class OperationController {
|
||||
}
|
||||
|
||||
@DeleteMapping("/sync/alignment")
|
||||
@RequiredAuthorize
|
||||
public Object deleteAlignment(@RequestParam Long id) {
|
||||
return operationService.deleteAlignmentById(id);
|
||||
}
|
||||
|
||||
@Permission({"topic:partition-detail:preferred", "op:replication-preferred"})
|
||||
@PostMapping("/replication/preferred")
|
||||
public Object electPreferredLeader(@RequestBody ReplicationDTO dto) {
|
||||
return operationService.electPreferredLeader(dto.getTopic(), dto.getPartition());
|
||||
}
|
||||
|
||||
@Permission("op:config-throttle")
|
||||
@PostMapping("/broker/throttle")
|
||||
@RequiredAuthorize
|
||||
public Object configThrottle(@RequestBody BrokerThrottleDTO dto) {
|
||||
return operationService.configThrottle(dto.getBrokerList(), dto.getUnit().toKb(dto.getThrottle()));
|
||||
}
|
||||
|
||||
@Permission("op:remove-throttle")
|
||||
@DeleteMapping("/broker/throttle")
|
||||
public Object removeThrottle(@RequestBody BrokerThrottleDTO dto) {
|
||||
return operationService.removeThrottle(dto.getBrokerList());
|
||||
}
|
||||
|
||||
@Permission("op:replication-update-detail")
|
||||
@GetMapping("/replication/reassignments")
|
||||
public Object currentReassignments() {
|
||||
return operationService.currentReassignments();
|
||||
}
|
||||
|
||||
@Permission("op:replication-update-detail:cancel")
|
||||
@DeleteMapping("/replication/reassignments")
|
||||
public Object cancelReassignment(@RequestBody TopicPartition partition) {
|
||||
return operationService.cancelReassignment(new org.apache.kafka.common.TopicPartition(partition.getTopic(), partition.getPartition()));
|
||||
}
|
||||
|
||||
@PostMapping("/replication/reassignments/proposed")
|
||||
@RequiredAuthorize
|
||||
public Object proposedAssignments(@RequestBody ProposedAssignmentDTO dto) {
|
||||
return operationService.proposedAssignments(dto.getTopic(), dto.getBrokers());
|
||||
}
|
||||
|
||||
@@ -1,19 +1,24 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.ReplicaAssignment;
|
||||
import com.xuxd.kafka.console.beans.annotation.RequiredAuthorize;
|
||||
import com.xuxd.kafka.console.beans.dto.AddPartitionDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.NewTopicDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.TopicThrottleDTO;
|
||||
import com.xuxd.kafka.console.beans.enums.TopicType;
|
||||
import com.xuxd.kafka.console.service.TopicService;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Collections;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.web.bind.annotation.DeleteMapping;
|
||||
import org.springframework.web.bind.annotation.GetMapping;
|
||||
import org.springframework.web.bind.annotation.PostMapping;
|
||||
import org.springframework.web.bind.annotation.RequestBody;
|
||||
import org.springframework.web.bind.annotation.RequestMapping;
|
||||
import org.springframework.web.bind.annotation.RequestParam;
|
||||
import org.springframework.web.bind.annotation.RestController;
|
||||
|
||||
/**
|
||||
* kafka-console-ui.
|
||||
@@ -33,32 +38,30 @@ public class TopicController {
|
||||
return topicService.getTopicNameList(false);
|
||||
}
|
||||
|
||||
@Permission("topic:load")
|
||||
@GetMapping("/list")
|
||||
public Object getTopicList(@RequestParam(required = false) String topic, @RequestParam String type) {
|
||||
return topicService.getTopicList(topic, TopicType.valueOf(type.toUpperCase()));
|
||||
}
|
||||
|
||||
@Permission({"topic:batch-del", "topic:del"})
|
||||
@DeleteMapping
|
||||
public Object deleteTopic(@RequestBody List<String> topics) {
|
||||
return topicService.deleteTopics(topics);
|
||||
@RequiredAuthorize
|
||||
public Object deleteTopic(@RequestParam String topic) {
|
||||
return topicService.deleteTopic(topic);
|
||||
}
|
||||
|
||||
@Permission("topic:partition-detail")
|
||||
@GetMapping("/partition")
|
||||
public Object getTopicPartitionInfo(@RequestParam String topic) {
|
||||
return topicService.getTopicPartitionInfo(topic.trim());
|
||||
}
|
||||
|
||||
@Permission("topic:add")
|
||||
@PostMapping("/new")
|
||||
@RequiredAuthorize
|
||||
public Object createNewTopic(@RequestBody NewTopicDTO topicDTO) {
|
||||
return topicService.createTopic(topicDTO.toNewTopic());
|
||||
}
|
||||
|
||||
@Permission("topic:partition-add")
|
||||
@PostMapping("/partition/new")
|
||||
@RequiredAuthorize
|
||||
public Object addPartition(@RequestBody AddPartitionDTO partitionDTO) {
|
||||
String topic = partitionDTO.getTopic().trim();
|
||||
int addNum = partitionDTO.getAddNum();
|
||||
@@ -80,19 +83,18 @@ public class TopicController {
|
||||
return topicService.getCurrentReplicaAssignment(topic);
|
||||
}
|
||||
|
||||
@Permission({"topic:replication-modify", "op:replication-reassign"})
|
||||
@PostMapping("/replica/assignment")
|
||||
@RequiredAuthorize
|
||||
public Object updateReplicaAssignment(@RequestBody ReplicaAssignment assignment) {
|
||||
return topicService.updateReplicaAssignment(assignment);
|
||||
}
|
||||
|
||||
@Permission("topic:replication-sync-throttle")
|
||||
@PostMapping("/replica/throttle")
|
||||
@RequiredAuthorize
|
||||
public Object configThrottle(@RequestBody TopicThrottleDTO dto) {
|
||||
return topicService.configThrottle(dto.getTopic(), dto.getPartitions(), dto.getOperation());
|
||||
}
|
||||
|
||||
@Permission("topic:send-count")
|
||||
@GetMapping("/send/stats")
|
||||
public Object sendStats(@RequestParam String topic) {
|
||||
return topicService.sendStats(topic);
|
||||
|
||||
@@ -1,97 +0,0 @@
|
||||
package com.xuxd.kafka.console.controller;
|
||||
|
||||
import com.xuxd.kafka.console.aspect.annotation.ControllerLog;
|
||||
import com.xuxd.kafka.console.aspect.annotation.Permission;
|
||||
import com.xuxd.kafka.console.beans.Credentials;
|
||||
import com.xuxd.kafka.console.beans.dto.SysPermissionDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.SysRoleDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.SysUserDTO;
|
||||
import com.xuxd.kafka.console.service.UserManageService;
|
||||
import org.springframework.web.bind.annotation.*;
|
||||
|
||||
import javax.servlet.http.HttpServletRequest;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:34
|
||||
**/
|
||||
@RestController
|
||||
@RequestMapping("/sys/user/manage")
|
||||
public class UserManageController {
|
||||
|
||||
private final UserManageService userManageService;
|
||||
|
||||
public UserManageController(UserManageService userManageService) {
|
||||
this.userManageService = userManageService;
|
||||
}
|
||||
|
||||
@Permission({"user-manage:user:add", "user-manage:user:change-role", "user-manage:user:reset-pass"})
|
||||
@ControllerLog("新增/更新用户")
|
||||
@PostMapping("/user")
|
||||
public Object addOrUpdateUser(@RequestBody SysUserDTO userDTO) {
|
||||
return userManageService.addOrUpdateUser(userDTO);
|
||||
}
|
||||
|
||||
@Permission("user-manage:role:save")
|
||||
@ControllerLog("新增/更新角色")
|
||||
@PostMapping("/role")
|
||||
public Object addOrUpdateRole(@RequestBody SysRoleDTO roleDTO) {
|
||||
return userManageService.addOrUdpateRole(roleDTO);
|
||||
}
|
||||
|
||||
@ControllerLog("新增权限")
|
||||
@PostMapping("/permission")
|
||||
public Object addPermission(@RequestBody SysPermissionDTO permissionDTO) {
|
||||
return userManageService.addPermission(permissionDTO);
|
||||
}
|
||||
|
||||
@Permission("user-manage:role:save")
|
||||
@ControllerLog("更新角色")
|
||||
@PutMapping("/role")
|
||||
public Object updateRole(@RequestBody SysRoleDTO roleDTO) {
|
||||
return userManageService.updateRole(roleDTO);
|
||||
}
|
||||
|
||||
@Permission({"user-manage:role"})
|
||||
@GetMapping("/role")
|
||||
public Object selectRole() {
|
||||
return userManageService.selectRole();
|
||||
}
|
||||
|
||||
@Permission({"user-manage:permission"})
|
||||
@GetMapping("/permission")
|
||||
public Object selectPermission() {
|
||||
return userManageService.selectPermission();
|
||||
}
|
||||
|
||||
@Permission({"user-manage:user"})
|
||||
@GetMapping("/user")
|
||||
public Object selectUser() {
|
||||
return userManageService.selectUser();
|
||||
}
|
||||
|
||||
@Permission("user-manage:role:del")
|
||||
@ControllerLog("删除角色")
|
||||
@DeleteMapping("/role")
|
||||
public Object deleteRole(@RequestParam Long id) {
|
||||
return userManageService.deleteRole(id);
|
||||
}
|
||||
|
||||
@Permission("user-manage:user:del")
|
||||
@ControllerLog("删除用户")
|
||||
@DeleteMapping("/user")
|
||||
public Object deleteUser(@RequestParam Long id) {
|
||||
return userManageService.deleteUser(id);
|
||||
}
|
||||
|
||||
@Permission("user-manage:setting")
|
||||
@ControllerLog("更新密码")
|
||||
@PostMapping("/user/password")
|
||||
public Object updatePassword(@RequestBody SysUserDTO userDTO, HttpServletRequest request) {
|
||||
Credentials credentials = (Credentials) request.getAttribute("credentials");
|
||||
if (credentials != null && !credentials.isInvalid()) {
|
||||
userDTO.setUsername(credentials.getUsername());
|
||||
}
|
||||
return userManageService.updatePassword(userDTO);
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,7 @@
|
||||
package com.xuxd.kafka.console.dao;
|
||||
|
||||
import com.baomidou.mybatisplus.core.mapper.BaseMapper;
|
||||
import com.xuxd.kafka.console.beans.dos.DevOpsUserDO;
|
||||
|
||||
public interface DevOpsUserMapper extends BaseMapper<DevOpsUserDO> {
|
||||
}
|
||||
@@ -1,15 +0,0 @@
|
||||
package com.xuxd.kafka.console.dao;
|
||||
|
||||
import com.baomidou.mybatisplus.core.mapper.BaseMapper;
|
||||
import com.xuxd.kafka.console.beans.dos.SysPermissionDO;
|
||||
import org.apache.ibatis.annotations.Mapper;
|
||||
|
||||
/**
|
||||
* 系统权限 .
|
||||
*
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:21
|
||||
**/
|
||||
@Mapper
|
||||
public interface SysPermissionMapper extends BaseMapper<SysPermissionDO> {
|
||||
}
|
||||
@@ -1,13 +0,0 @@
|
||||
package com.xuxd.kafka.console.dao;
|
||||
|
||||
import com.baomidou.mybatisplus.core.mapper.BaseMapper;
|
||||
import com.xuxd.kafka.console.beans.dos.SysRoleDO;
|
||||
import org.apache.ibatis.annotations.Mapper;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:22
|
||||
**/
|
||||
@Mapper
|
||||
public interface SysRoleMapper extends BaseMapper<SysRoleDO> {
|
||||
}
|
||||
@@ -1,13 +0,0 @@
|
||||
package com.xuxd.kafka.console.dao;
|
||||
|
||||
import com.baomidou.mybatisplus.core.mapper.BaseMapper;
|
||||
import com.xuxd.kafka.console.beans.dos.SysUserDO;
|
||||
import org.apache.ibatis.annotations.Mapper;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:22
|
||||
**/
|
||||
@Mapper
|
||||
public interface SysUserMapper extends BaseMapper<SysUserDO> {
|
||||
}
|
||||
@@ -1,93 +0,0 @@
|
||||
package com.xuxd.kafka.console.dao.init;
|
||||
|
||||
import com.xuxd.kafka.console.beans.RolePermUpdateEvent;
|
||||
import com.xuxd.kafka.console.config.AuthConfig;
|
||||
import com.xuxd.kafka.console.dao.SysPermissionMapper;
|
||||
import com.xuxd.kafka.console.dao.SysRoleMapper;
|
||||
import com.xuxd.kafka.console.dao.SysUserMapper;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.springframework.beans.factory.SmartInitializingSingleton;
|
||||
import org.springframework.context.ApplicationEventPublisher;
|
||||
import org.springframework.stereotype.Component;
|
||||
|
||||
import javax.sql.DataSource;
|
||||
import java.sql.Connection;
|
||||
import java.sql.PreparedStatement;
|
||||
import java.sql.SQLException;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/17 13:10
|
||||
**/
|
||||
@Slf4j
|
||||
@Component
|
||||
public class DataInit implements SmartInitializingSingleton {
|
||||
|
||||
private final AuthConfig authConfig;
|
||||
|
||||
private final SysUserMapper userMapper;
|
||||
|
||||
private final SysRoleMapper roleMapper;
|
||||
|
||||
private final SysPermissionMapper permissionMapper;
|
||||
|
||||
private final DataSource dataSource;
|
||||
|
||||
private final SqlParse sqlParse;
|
||||
|
||||
private final ApplicationEventPublisher publisher;
|
||||
|
||||
|
||||
public DataInit(AuthConfig authConfig,
|
||||
SysUserMapper userMapper,
|
||||
SysRoleMapper roleMapper,
|
||||
SysPermissionMapper permissionMapper,
|
||||
DataSource dataSource,
|
||||
ApplicationEventPublisher publisher) {
|
||||
this.authConfig = authConfig;
|
||||
this.userMapper = userMapper;
|
||||
this.roleMapper = roleMapper;
|
||||
this.permissionMapper = permissionMapper;
|
||||
this.dataSource = dataSource;
|
||||
this.publisher = publisher;
|
||||
this.sqlParse = new SqlParse();
|
||||
}
|
||||
|
||||
@Override
|
||||
public void afterSingletonsInstantiated() {
|
||||
if (!authConfig.isEnable()) {
|
||||
log.info("Disable login authentication, no longer try to initialize the data");
|
||||
return;
|
||||
}
|
||||
try {
|
||||
Connection connection = dataSource.getConnection();
|
||||
Integer userCount = userMapper.selectCount(null);
|
||||
if (userCount == null || userCount == 0) {
|
||||
initData(connection, SqlParse.USER_TABLE);
|
||||
}
|
||||
|
||||
Integer roleCount = roleMapper.selectCount(null);
|
||||
if (roleCount == null || roleCount == 0) {
|
||||
initData(connection, SqlParse.ROLE_TABLE);
|
||||
}
|
||||
|
||||
Integer permCount = permissionMapper.selectCount(null);
|
||||
if (permCount == null || permCount == 0) {
|
||||
initData(connection, SqlParse.PERM_TABLE);
|
||||
}
|
||||
RolePermUpdateEvent event = new RolePermUpdateEvent(this);
|
||||
event.setReload(true);
|
||||
publisher.publishEvent(event);
|
||||
} catch (SQLException e) {
|
||||
throw new RuntimeException(e);
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
private void initData(Connection connection, String table) throws SQLException {
|
||||
log.info("Init default data for {}", table);
|
||||
String sql = sqlParse.getMergeSql(table);
|
||||
PreparedStatement statement = connection.prepareStatement(sql);
|
||||
statement.execute();
|
||||
}
|
||||
}
|
||||
@@ -1,85 +0,0 @@
|
||||
package com.xuxd.kafka.console.dao.init;
|
||||
|
||||
import com.google.common.io.Files;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.springframework.util.ResourceUtils;
|
||||
import scala.collection.mutable.StringBuilder;
|
||||
|
||||
import java.io.File;
|
||||
import java.io.FileNotFoundException;
|
||||
import java.io.IOException;
|
||||
import java.nio.charset.Charset;
|
||||
import java.util.ArrayList;
|
||||
import java.util.HashMap;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/17 21:22
|
||||
**/
|
||||
@Slf4j
|
||||
public class SqlParse {
|
||||
|
||||
private final String FILE = "classpath:db/data-h2.sql";
|
||||
|
||||
private final Map<String, List<String>> sqlMap = new HashMap<>();
|
||||
|
||||
public static final String ROLE_TABLE = "t_sys_role";
|
||||
public static final String USER_TABLE = "t_sys_user";
|
||||
public static final String PERM_TABLE = "t_sys_permission";
|
||||
|
||||
public SqlParse() {
|
||||
sqlMap.put(ROLE_TABLE, new ArrayList<>());
|
||||
sqlMap.put(USER_TABLE, new ArrayList<>());
|
||||
sqlMap.put(PERM_TABLE, new ArrayList<>());
|
||||
|
||||
String table = null;
|
||||
try {
|
||||
File file = ResourceUtils.getFile(FILE);
|
||||
List<String> lines = Files.readLines(file, Charset.forName("UTF-8"));
|
||||
for (String str : lines) {
|
||||
if (StringUtils.isNotEmpty(str)) {
|
||||
if (str.indexOf("start--") > 0) {
|
||||
if (str.indexOf(ROLE_TABLE) > 0) {
|
||||
table = ROLE_TABLE;
|
||||
}
|
||||
if (str.indexOf(USER_TABLE) > 0) {
|
||||
table = USER_TABLE;
|
||||
}
|
||||
if (str.indexOf(PERM_TABLE) > 0) {
|
||||
table = PERM_TABLE;
|
||||
}
|
||||
}
|
||||
if (isSql(str)) {
|
||||
if (table == null) {
|
||||
log.error("Table is null, can not load sql: {}", str);
|
||||
continue;
|
||||
}
|
||||
sqlMap.get(table).add(str);
|
||||
}
|
||||
}
|
||||
}
|
||||
} catch (FileNotFoundException e) {
|
||||
throw new RuntimeException(e);
|
||||
} catch (IOException e) {
|
||||
throw new RuntimeException(e);
|
||||
}
|
||||
}
|
||||
|
||||
public List<String> getSqlList(String table) {
|
||||
return sqlMap.get(table);
|
||||
}
|
||||
|
||||
public String getMergeSql(String table) {
|
||||
List<String> list = getSqlList(table);
|
||||
StringBuilder sb = new StringBuilder();
|
||||
list.forEach(sql -> sb.append(sql));
|
||||
return sb.toString();
|
||||
}
|
||||
|
||||
private boolean isSql(String str) {
|
||||
return StringUtils.isNotEmpty(str) && str.startsWith("insert");
|
||||
}
|
||||
}
|
||||
@@ -1,38 +0,0 @@
|
||||
package com.xuxd.kafka.console.exception;
|
||||
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.springframework.http.HttpStatus;
|
||||
import org.springframework.web.bind.annotation.ControllerAdvice;
|
||||
import org.springframework.web.bind.annotation.ExceptionHandler;
|
||||
import org.springframework.web.bind.annotation.ResponseBody;
|
||||
import org.springframework.web.bind.annotation.ResponseStatus;
|
||||
|
||||
import javax.servlet.http.HttpServletRequest;
|
||||
|
||||
/**
|
||||
* kafka-console-ui.
|
||||
*
|
||||
* @author xuxd
|
||||
* @date 2021-10-19 14:32:18
|
||||
**/
|
||||
@Slf4j
|
||||
@ControllerAdvice(basePackages = "com.xuxd.kafka.console.controller")
|
||||
public class GlobalExceptionHandler {
|
||||
|
||||
@ResponseStatus(code = HttpStatus.FORBIDDEN)
|
||||
@ExceptionHandler(value = UnAuthorizedException.class)
|
||||
@ResponseBody
|
||||
public Object unAuthorizedExceptionHandler(HttpServletRequest req, Exception ex) throws Exception {
|
||||
log.error("unAuthorized: {}", ex.getMessage());
|
||||
return ResponseData.create().failed("UnAuthorized: " + ex.getMessage());
|
||||
}
|
||||
|
||||
@ExceptionHandler(value = Exception.class)
|
||||
@ResponseBody
|
||||
public Object exceptionHandler(HttpServletRequest req, Exception ex) throws Exception {
|
||||
|
||||
log.error("exception handle: ", ex);
|
||||
return ResponseData.create().failed(ex.getMessage());
|
||||
}
|
||||
}
|
||||
@@ -1,12 +0,0 @@
|
||||
package com.xuxd.kafka.console.exception;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/17 23:08
|
||||
**/
|
||||
public class UnAuthorizedException extends RuntimeException{
|
||||
|
||||
public UnAuthorizedException(String message) {
|
||||
super(message);
|
||||
}
|
||||
}
|
||||
@@ -1,70 +0,0 @@
|
||||
package com.xuxd.kafka.console.filter;
|
||||
|
||||
import com.xuxd.kafka.console.beans.Credentials;
|
||||
import com.xuxd.kafka.console.config.AuthConfig;
|
||||
import com.xuxd.kafka.console.utils.AuthUtil;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.springframework.core.annotation.Order;
|
||||
import org.springframework.http.HttpStatus;
|
||||
|
||||
import javax.servlet.*;
|
||||
import javax.servlet.annotation.WebFilter;
|
||||
import javax.servlet.http.HttpServletRequest;
|
||||
import javax.servlet.http.HttpServletResponse;
|
||||
import java.io.IOException;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/9 21:20
|
||||
**/
|
||||
@Order(1)
|
||||
@WebFilter(filterName = "auth-filter", urlPatterns = {"/*"})
|
||||
@Slf4j
|
||||
public class AuthFilter implements Filter {
|
||||
|
||||
private final AuthConfig authConfig;
|
||||
|
||||
private final String TOKEN_HEADER = "X-Auth-Token";
|
||||
|
||||
private final String AUTH_URI_PREFIX = "/auth";
|
||||
|
||||
public AuthFilter(AuthConfig authConfig) {
|
||||
this.authConfig = authConfig;
|
||||
}
|
||||
|
||||
@Override
|
||||
public void doFilter(ServletRequest servletRequest, ServletResponse servletResponse, FilterChain filterChain) throws IOException, ServletException {
|
||||
if (!authConfig.isEnable()) {
|
||||
filterChain.doFilter(servletRequest, servletResponse);
|
||||
return;
|
||||
}
|
||||
HttpServletRequest request = (HttpServletRequest) servletRequest;
|
||||
HttpServletResponse response = (HttpServletResponse) servletResponse;
|
||||
String accessToken = request.getHeader(TOKEN_HEADER);
|
||||
|
||||
String requestURI = request.getRequestURI();
|
||||
if (requestURI.startsWith(AUTH_URI_PREFIX)) {
|
||||
filterChain.doFilter(servletRequest, servletResponse);
|
||||
return;
|
||||
}
|
||||
if (StringUtils.isEmpty(accessToken)) {
|
||||
response.setStatus(HttpStatus.UNAUTHORIZED.value());
|
||||
return;
|
||||
}
|
||||
|
||||
Credentials credentials = AuthUtil.parseToken(authConfig.getSecret(), accessToken);
|
||||
if (credentials.isInvalid()) {
|
||||
response.setStatus(HttpStatus.UNAUTHORIZED.value());
|
||||
return;
|
||||
}
|
||||
request.setAttribute("credentials", credentials);
|
||||
|
||||
try {
|
||||
CredentialsContext.set(credentials);
|
||||
filterChain.doFilter(servletRequest, servletResponse);
|
||||
}finally {
|
||||
CredentialsContext.remove();
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -1,23 +0,0 @@
|
||||
package com.xuxd.kafka.console.filter;
|
||||
|
||||
import com.xuxd.kafka.console.beans.Credentials;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/17 23:02
|
||||
**/
|
||||
public class CredentialsContext {
|
||||
private static final ThreadLocal<Credentials> CREDENTIALS = new ThreadLocal<>();
|
||||
|
||||
public static void set(Credentials credentials) {
|
||||
CREDENTIALS.set(credentials);
|
||||
}
|
||||
|
||||
public static Credentials get() {
|
||||
return CREDENTIALS.get();
|
||||
}
|
||||
|
||||
public static void remove() {
|
||||
CREDENTIALS.remove();
|
||||
}
|
||||
}
|
||||
@@ -1,4 +1,4 @@
|
||||
package com.xuxd.kafka.console.filter;
|
||||
package com.xuxd.kafka.console.interceptor;
|
||||
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.dos.ClusterInfoDO;
|
||||
@@ -6,18 +6,20 @@ import com.xuxd.kafka.console.config.ContextConfig;
|
||||
import com.xuxd.kafka.console.config.ContextConfigHolder;
|
||||
import com.xuxd.kafka.console.dao.ClusterInfoMapper;
|
||||
import com.xuxd.kafka.console.utils.ConvertUtil;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.core.annotation.Order;
|
||||
import org.springframework.http.MediaType;
|
||||
|
||||
import javax.servlet.*;
|
||||
import javax.servlet.annotation.WebFilter;
|
||||
import javax.servlet.http.HttpServletRequest;
|
||||
import java.io.IOException;
|
||||
import java.util.HashSet;
|
||||
import java.util.Set;
|
||||
import javax.servlet.Filter;
|
||||
import javax.servlet.FilterChain;
|
||||
import javax.servlet.ServletException;
|
||||
import javax.servlet.ServletRequest;
|
||||
import javax.servlet.ServletResponse;
|
||||
import javax.servlet.annotation.WebFilter;
|
||||
import javax.servlet.http.HttpServletRequest;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.http.MediaType;
|
||||
|
||||
/**
|
||||
* kafka-console-ui.
|
||||
@@ -25,8 +27,7 @@ import java.util.Set;
|
||||
* @author xuxd
|
||||
* @date 2022-01-05 19:56:25
|
||||
**/
|
||||
@Order(100)
|
||||
@WebFilter(filterName = "context-set-filter", urlPatterns = {"/acl/*", "/user/*", "/cluster/*", "/config/*", "/consumer/*", "/message/*", "/topic/*", "/op/*", "/client/*"})
|
||||
@WebFilter(filterName = "context-set-filter", urlPatterns = {"/acl/*","/user/*","/cluster/*","/config/*","/consumer/*","/message/*","/topic/*","/op/*"})
|
||||
@Slf4j
|
||||
public class ContextSetFilter implements Filter {
|
||||
|
||||
@@ -41,9 +42,8 @@ public class ContextSetFilter implements Filter {
|
||||
@Autowired
|
||||
private ClusterInfoMapper clusterInfoMapper;
|
||||
|
||||
@Override
|
||||
public void doFilter(ServletRequest req, ServletResponse response,
|
||||
FilterChain chain) throws IOException, ServletException {
|
||||
@Override public void doFilter(ServletRequest req, ServletResponse response,
|
||||
FilterChain chain) throws IOException, ServletException {
|
||||
try {
|
||||
HttpServletRequest request = (HttpServletRequest) req;
|
||||
String uri = request.getRequestURI();
|
||||
@@ -72,7 +72,6 @@ public class ContextSetFilter implements Filter {
|
||||
config.setProperties(ConvertUtil.toProperties(infoDO.getProperties()));
|
||||
}
|
||||
ContextConfigHolder.CONTEXT_CONFIG.set(config);
|
||||
// log.info("current kafka config: {}", config);
|
||||
}
|
||||
}
|
||||
chain.doFilter(req, response);
|
||||
@@ -0,0 +1,25 @@
|
||||
package com.xuxd.kafka.console.interceptor;
|
||||
|
||||
import com.xuxd.kafka.console.utils.ResponseUtil;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.springframework.web.bind.annotation.ControllerAdvice;
|
||||
import org.springframework.web.bind.annotation.ExceptionHandler;
|
||||
import org.springframework.web.bind.annotation.ResponseBody;
|
||||
|
||||
/**
|
||||
* kafka-console-ui.
|
||||
*
|
||||
* @author xuxd
|
||||
* @date 2021-10-19 14:32:18
|
||||
**/
|
||||
@Slf4j
|
||||
@ControllerAdvice(basePackages = "com.xuxd.kafka.console")
|
||||
public class GlobalExceptionHandler {
|
||||
|
||||
@ExceptionHandler(value = Exception.class)
|
||||
@ResponseBody
|
||||
public Object exceptionHandler(Exception ex) {
|
||||
log.error("exception handle: ", ex);
|
||||
return ResponseUtil.error(ex.getMessage());
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,84 @@
|
||||
package com.xuxd.kafka.console.interceptor;
|
||||
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.annotation.RequiredAuthorize;
|
||||
import com.xuxd.kafka.console.beans.enums.Role;
|
||||
import com.xuxd.kafka.console.beans.vo.DevOpsUserVO;
|
||||
import com.xuxd.kafka.console.service.DevOpsUserService;
|
||||
import com.xuxd.kafka.console.utils.ContextUtil;
|
||||
import com.xuxd.kafka.console.utils.ConvertUtil;
|
||||
import com.xuxd.kafka.console.utils.JwtUtils;
|
||||
import lombok.RequiredArgsConstructor;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.springframework.stereotype.Component;
|
||||
import org.springframework.web.method.HandlerMethod;
|
||||
import org.springframework.web.servlet.AsyncHandlerInterceptor;
|
||||
|
||||
import javax.servlet.http.HttpServletRequest;
|
||||
import javax.servlet.http.HttpServletResponse;
|
||||
import java.io.PrintWriter;
|
||||
|
||||
import static com.xuxd.kafka.console.beans.ResponseData.TOKEN_ILLEGAL;
|
||||
|
||||
@Component
|
||||
@Slf4j
|
||||
@RequiredArgsConstructor
|
||||
public class TokenInterceptor implements AsyncHandlerInterceptor {
|
||||
|
||||
private final static String TOKEN = "token";
|
||||
private final DevOpsUserService devOpsUserService;
|
||||
|
||||
@Override
|
||||
public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
|
||||
if (handler instanceof HandlerMethod){
|
||||
String token = request.getHeader(TOKEN);
|
||||
if (StringUtils.isBlank(token)){
|
||||
log.info("token not exist");
|
||||
write(response);
|
||||
return false;
|
||||
}
|
||||
|
||||
String username = JwtUtils.parse(token);
|
||||
if (StringUtils.isBlank(username)){
|
||||
log.info("{} is wrongful", token);
|
||||
write(response);
|
||||
return false;
|
||||
}
|
||||
|
||||
ResponseData<DevOpsUserVO> userVORsp = devOpsUserService.detail(username);
|
||||
if (userVORsp == null || userVORsp.getData() == null){
|
||||
log.info("{} not exist", username);
|
||||
write(response);
|
||||
return false;
|
||||
}
|
||||
|
||||
ContextUtil.set(ContextUtil.USERNAME, username);
|
||||
HandlerMethod method = (HandlerMethod)handler;
|
||||
RequiredAuthorize annotation = method.getMethodAnnotation(RequiredAuthorize.class);
|
||||
if (annotation != null){
|
||||
DevOpsUserVO userVO = userVORsp.getData();
|
||||
if (!userVO.getRole().equals(Role.manager)){
|
||||
log.info("{},{} no permission", username, request.getRequestURI());
|
||||
write(response);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
}
|
||||
return true;
|
||||
}
|
||||
|
||||
private void write(HttpServletResponse response){
|
||||
PrintWriter writer = null;
|
||||
try {
|
||||
writer = response.getWriter();
|
||||
writer.write(ConvertUtil.toJsonString(ResponseData.create().failed(TOKEN_ILLEGAL)));
|
||||
} catch (Exception ignored){
|
||||
} finally {
|
||||
if (writer != null){
|
||||
writer.flush();
|
||||
writer.close();
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,20 @@
|
||||
package com.xuxd.kafka.console.interceptor;
|
||||
|
||||
import lombok.RequiredArgsConstructor;
|
||||
import org.springframework.context.annotation.Configuration;
|
||||
import org.springframework.web.servlet.config.annotation.InterceptorRegistry;
|
||||
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;
|
||||
|
||||
@Configuration
|
||||
@RequiredArgsConstructor
|
||||
public class WebMvcConfig implements WebMvcConfigurer {
|
||||
|
||||
private final TokenInterceptor tokenInterceptor;
|
||||
|
||||
@Override
|
||||
public void addInterceptors(InterceptorRegistry registry) {
|
||||
registry.addInterceptor(tokenInterceptor)
|
||||
.addPathPatterns("/**")
|
||||
.excludePathPatterns("/devops/user/login");
|
||||
}
|
||||
}
|
||||
@@ -42,7 +42,4 @@ public interface AclService {
|
||||
|
||||
ResponseData getUserDetail(String username);
|
||||
|
||||
ResponseData clearAcl(AclEntry entry);
|
||||
|
||||
ResponseData getSaslScramUserList(AclEntry entry);
|
||||
}
|
||||
|
||||
@@ -1,13 +0,0 @@
|
||||
package com.xuxd.kafka.console.service;
|
||||
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.dto.LoginUserDTO;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/14 19:00
|
||||
**/
|
||||
public interface AuthService {
|
||||
|
||||
ResponseData login(LoginUserDTO userDTO);
|
||||
}
|
||||
@@ -1,18 +0,0 @@
|
||||
package com.xuxd.kafka.console.service;
|
||||
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.dto.AlterClientQuotaDTO;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* @author 晓东哥哥
|
||||
*/
|
||||
public interface ClientQuotaService {
|
||||
|
||||
ResponseData getClientQuotaConfigs(List<String> types, List<String> names);
|
||||
|
||||
ResponseData alterClientQuotaConfigs(AlterClientQuotaDTO request);
|
||||
|
||||
ResponseData deleteClientQuotaConfigs(AlterClientQuotaDTO request);
|
||||
}
|
||||
@@ -0,0 +1,26 @@
|
||||
package com.xuxd.kafka.console.service;
|
||||
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.dto.user.AddUserDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.user.ListUserDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.user.UpdateUserDTO;
|
||||
import com.xuxd.kafka.console.beans.vo.DevOpsUserVO;
|
||||
import com.xuxd.kafka.console.beans.vo.LoginVO;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
public interface DevOpsUserService {
|
||||
|
||||
ResponseData<Boolean> add(AddUserDTO addUserDTO);
|
||||
|
||||
ResponseData<Boolean> update(UpdateUserDTO updateUserDTO);
|
||||
|
||||
ResponseData<Boolean> delete(Long id);
|
||||
|
||||
ResponseData<List<DevOpsUserVO>> list(ListUserDTO listUserDTO);
|
||||
|
||||
ResponseData<DevOpsUserVO> detail(String username);
|
||||
|
||||
ResponseData<LoginVO> login(String username, String password);
|
||||
|
||||
}
|
||||
@@ -4,8 +4,6 @@ import com.xuxd.kafka.console.beans.QueryMessage;
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.SendMessage;
|
||||
|
||||
import java.util.List;
|
||||
|
||||
/**
|
||||
* kafka-console-ui.
|
||||
*
|
||||
@@ -25,6 +23,4 @@ public interface MessageService {
|
||||
ResponseData send(SendMessage message);
|
||||
|
||||
ResponseData resend(SendMessage message);
|
||||
|
||||
ResponseData delete(List<QueryMessage> messages);
|
||||
}
|
||||
|
||||
@@ -4,8 +4,6 @@ import com.xuxd.kafka.console.beans.ReplicaAssignment;
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.enums.TopicThrottleSwitch;
|
||||
import com.xuxd.kafka.console.beans.enums.TopicType;
|
||||
|
||||
import java.util.Collection;
|
||||
import java.util.List;
|
||||
import org.apache.kafka.clients.admin.NewTopic;
|
||||
|
||||
@@ -21,7 +19,7 @@ public interface TopicService {
|
||||
|
||||
ResponseData getTopicList(String topic, TopicType type);
|
||||
|
||||
ResponseData deleteTopics(Collection<String> topics);
|
||||
ResponseData deleteTopic(String topic);
|
||||
|
||||
ResponseData getTopicPartitionInfo(String topic);
|
||||
|
||||
|
||||
@@ -1,40 +0,0 @@
|
||||
package com.xuxd.kafka.console.service;
|
||||
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.dto.SysPermissionDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.SysRoleDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.SysUserDTO;
|
||||
|
||||
/**
|
||||
* 登录用户权限管理.
|
||||
*
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:24
|
||||
**/
|
||||
public interface UserManageService {
|
||||
|
||||
/**
|
||||
* 增加权限
|
||||
*/
|
||||
ResponseData addPermission(SysPermissionDTO permissionDTO);
|
||||
|
||||
ResponseData addOrUdpateRole(SysRoleDTO roleDTO);
|
||||
|
||||
ResponseData addOrUpdateUser(SysUserDTO userDTO);
|
||||
|
||||
ResponseData selectRole();
|
||||
|
||||
ResponseData selectPermission();
|
||||
|
||||
ResponseData selectUser();
|
||||
|
||||
ResponseData updateUser(SysUserDTO userDTO);
|
||||
|
||||
ResponseData updateRole(SysRoleDTO roleDTO);
|
||||
|
||||
ResponseData deleteRole(Long id);
|
||||
|
||||
ResponseData deleteUser(Long id);
|
||||
|
||||
ResponseData updatePassword(SysUserDTO userDTO);
|
||||
}
|
||||
@@ -10,23 +10,30 @@ import com.xuxd.kafka.console.config.ContextConfigHolder;
|
||||
import com.xuxd.kafka.console.dao.KafkaUserMapper;
|
||||
import com.xuxd.kafka.console.service.AclService;
|
||||
import com.xuxd.kafka.console.utils.SaslUtil;
|
||||
import java.util.Arrays;
|
||||
import java.util.Collections;
|
||||
import java.util.HashMap;
|
||||
import java.util.List;
|
||||
import java.util.Map;
|
||||
import java.util.Properties;
|
||||
import java.util.Set;
|
||||
import java.util.stream.Collectors;
|
||||
import kafka.console.KafkaAclConsole;
|
||||
import kafka.console.KafkaConfigConsole;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.apache.kafka.clients.CommonClientConfigs;
|
||||
import org.apache.kafka.clients.admin.ScramMechanism;
|
||||
import org.apache.kafka.clients.admin.UserScramCredentialsDescription;
|
||||
import org.apache.kafka.common.acl.AclBinding;
|
||||
import org.apache.kafka.common.acl.AclOperation;
|
||||
import org.apache.kafka.common.config.SaslConfigs;
|
||||
import org.apache.kafka.common.errors.SecurityDisabledException;
|
||||
import org.apache.kafka.common.security.auth.SecurityProtocol;
|
||||
import org.springframework.beans.factory.ObjectProvider;
|
||||
import org.springframework.beans.factory.annotation.Autowired;
|
||||
import org.springframework.stereotype.Service;
|
||||
import scala.Tuple2;
|
||||
|
||||
import java.util.*;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
import static com.xuxd.kafka.console.utils.SaslUtil.isEnableSasl;
|
||||
import static com.xuxd.kafka.console.utils.SaslUtil.isEnableScram;
|
||||
|
||||
@@ -132,52 +139,39 @@ public class AclServiceImpl implements AclService {
    }

    @Override public ResponseData getAclList(AclEntry entry) {
        List<AclBinding> aclBindingList = Collections.emptyList();
        try {
            aclBindingList = entry.isNull() ? aclConsole.getAclList(null) : aclConsole.getAclList(entry);
        } catch (Exception ex) {
            if (ex.getCause() instanceof SecurityDisabledException) {
                Throwable e = ex.getCause();
                log.info("SecurityDisabledException: {}", e.getMessage());
                Map<String, String> hint = new HashMap<>(2);
                hint.put("hint", "Security Disabled: " + e.getMessage());
                return ResponseData.create().data(hint).success();
            }
            throw new RuntimeException(ex.getCause());
        }
        // List<AclBinding> aclBindingList = entry.isNull() ? aclConsole.getAclList(null) : aclConsole.getAclList(entry);
        List<AclBinding> aclBindingList = entry.isNull() ? aclConsole.getAclList(null) : aclConsole.getAclList(entry);
        List<AclEntry> entryList = aclBindingList.stream().map(x -> AclEntry.valueOf(x)).collect(Collectors.toList());
        Map<String, List<AclEntry>> entryMap = entryList.stream().collect(Collectors.groupingBy(AclEntry::getPrincipal));
        Map<String, Object> resultMap = new HashMap<>();
        entryMap.forEach((k, v) -> {
            Map<String, List<AclEntry>> map = v.stream().collect(Collectors.groupingBy(e -> e.getResourceType() + "#" + e.getName()));
            // String username = SaslUtil.findUsername(ContextConfigHolder.CONTEXT_CONFIG.get().getProperties().getProperty(SaslConfigs.SASL_JAAS_CONFIG));
            // if (k.equals(username)) {
            //     Map<String, Object> map2 = new HashMap<>(map);
            //     Map<String, Object> userMap = new HashMap<>();
            //     userMap.put("role", "admin");
            //     map2.put("USER", userMap);
            // }
            String username = SaslUtil.findUsername(ContextConfigHolder.CONTEXT_CONFIG.get().getProperties().getProperty(SaslConfigs.SASL_JAAS_CONFIG));
            if (k.equals(username)) {
                Map<String, Object> map2 = new HashMap<>(map);
                Map<String, Object> userMap = new HashMap<>();
                userMap.put("role", "admin");
                map2.put("USER", userMap);
            }
            resultMap.put(k, map);
        });
        // if (entry.isNull() || StringUtils.isNotBlank(entry.getPrincipal())) {
        //     Map<String, UserScramCredentialsDescription> detailList = configConsole.getUserDetailList(StringUtils.isNotBlank(entry.getPrincipal()) ? Collections.singletonList(entry.getPrincipal()) : null);
        //
        //     detailList.values().forEach(u -> {
        //         if (!resultMap.containsKey(u.name()) && !u.credentialInfos().isEmpty()) {
        //             String username = SaslUtil.findUsername(ContextConfigHolder.CONTEXT_CONFIG.get().getProperties().getProperty(SaslConfigs.SASL_JAAS_CONFIG));
        //             if (!u.name().equals(username)) {
        //                 resultMap.put(u.name(), Collections.emptyMap());
        //             } else {
        //                 Map<String, Object> map2 = new HashMap<>();
        //                 Map<String, Object> userMap = new HashMap<>();
        //                 userMap.put("role", "admin");
        //                 map2.put("USER", userMap);
        //                 resultMap.put(u.name(), map2);
        //             }
        //         }
        //     });
        // }
        if (entry.isNull() || StringUtils.isNotBlank(entry.getPrincipal())) {
            Map<String, UserScramCredentialsDescription> detailList = configConsole.getUserDetailList(StringUtils.isNotBlank(entry.getPrincipal()) ? Collections.singletonList(entry.getPrincipal()) : null);

            detailList.values().forEach(u -> {
                if (!resultMap.containsKey(u.name()) && !u.credentialInfos().isEmpty()) {
                    String username = SaslUtil.findUsername(ContextConfigHolder.CONTEXT_CONFIG.get().getProperties().getProperty(SaslConfigs.SASL_JAAS_CONFIG));
                    if (!u.name().equals(username)) {
                        resultMap.put(u.name(), Collections.emptyMap());
                    } else {
                        Map<String, Object> map2 = new HashMap<>();
                        Map<String, Object> userMap = new HashMap<>();
                        userMap.put("role", "admin");
                        map2.put("USER", userMap);
                        resultMap.put(u.name(), map2);
                    }
                }
            });
        }

        return ResponseData.create().data(new CounterMap<>(resultMap)).success();
    }
@@ -242,37 +236,6 @@ public class AclServiceImpl implements AclService {
        return ResponseData.create().data(vo).success();
    }

    @Override
    public ResponseData clearAcl(AclEntry entry) {
        log.info("Start clear acl, principal: {}", entry);
        return aclConsole.deleteUserAcl(entry) ? ResponseData.create().success() : ResponseData.create().failed("操作失败");
    }

    @Override
    public ResponseData getSaslScramUserList(AclEntry entry) {
        Map<String, Object> resultMap = new HashMap<>();
        if (entry.isNull() || StringUtils.isNotBlank(entry.getPrincipal())) {
            Map<String, UserScramCredentialsDescription> detailList = configConsole.getUserDetailList(StringUtils.isNotBlank(entry.getPrincipal()) ? Collections.singletonList(entry.getPrincipal()) : null);

            detailList.values().forEach(u -> {
                if (!resultMap.containsKey(u.name()) && !u.credentialInfos().isEmpty()) {
                    String username = SaslUtil.findUsername(ContextConfigHolder.CONTEXT_CONFIG.get().getProperties().getProperty(SaslConfigs.SASL_JAAS_CONFIG));
                    if (!u.name().equals(username)) {
                        resultMap.put(u.name(), Collections.emptyMap());
                    } else {
                        Map<String, Object> map2 = new HashMap<>();
                        Map<String, Object> userMap = new HashMap<>();
                        userMap.put("role", "admin");
                        map2.put("USER", userMap);
                        resultMap.put(u.name(), map2);
                    }
                }
            });
        }

        return ResponseData.create().data(new CounterMap<>(resultMap)).success();
    }

    // @Override public void afterSingletonsInstantiated() {
    //     if (kafkaConfig.isEnableAcl() && kafkaConfig.isAdminCreate()) {
    //         log.info("Start create admin user, username: {}, password: {}", kafkaConfig.getAdminUsername(), kafkaConfig.getAdminPassword());
@@ -1,96 +0,0 @@
|
||||
package com.xuxd.kafka.console.service.impl;
|
||||
|
||||
import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
|
||||
import com.xuxd.kafka.console.beans.Credentials;
|
||||
import com.xuxd.kafka.console.beans.LoginResult;
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.dos.SysRoleDO;
|
||||
import com.xuxd.kafka.console.beans.dos.SysUserDO;
|
||||
import com.xuxd.kafka.console.beans.dto.LoginUserDTO;
|
||||
import com.xuxd.kafka.console.cache.RolePermCache;
|
||||
import com.xuxd.kafka.console.config.AuthConfig;
|
||||
import com.xuxd.kafka.console.dao.SysRoleMapper;
|
||||
import com.xuxd.kafka.console.dao.SysUserMapper;
|
||||
import com.xuxd.kafka.console.service.AuthService;
|
||||
import com.xuxd.kafka.console.utils.AuthUtil;
|
||||
import com.xuxd.kafka.console.utils.UUIDStrUtil;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import java.util.ArrayList;
|
||||
import java.util.Arrays;
|
||||
import java.util.List;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/14 19:01
|
||||
**/
|
||||
@Slf4j
|
||||
@Service
|
||||
public class AuthServiceImpl implements AuthService {
|
||||
|
||||
private final SysUserMapper userMapper;
|
||||
|
||||
private final SysRoleMapper roleMapper;
|
||||
|
||||
private final AuthConfig authConfig;
|
||||
|
||||
private final RolePermCache rolePermCache;
|
||||
|
||||
public AuthServiceImpl(SysUserMapper userMapper,
|
||||
SysRoleMapper roleMapper,
|
||||
AuthConfig authConfig,
|
||||
RolePermCache rolePermCache) {
|
||||
this.userMapper = userMapper;
|
||||
this.roleMapper = roleMapper;
|
||||
this.authConfig = authConfig;
|
||||
this.rolePermCache = rolePermCache;
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData login(LoginUserDTO userDTO) {
|
||||
QueryWrapper<SysUserDO> queryWrapper = new QueryWrapper<>();
|
||||
queryWrapper.eq("username", userDTO.getUsername());
|
||||
SysUserDO userDO = userMapper.selectOne(queryWrapper);
|
||||
if (userDO == null) {
|
||||
return ResponseData.create().failed("用户名/密码不正确");
|
||||
}
|
||||
String encrypt = UUIDStrUtil.generate(userDTO.getPassword(), userDO.getSalt());
|
||||
if (!userDO.getPassword().equals(encrypt)) {
|
||||
return ResponseData.create().failed("用户名/密码不正确");
|
||||
}
|
||||
Credentials credentials = new Credentials();
|
||||
credentials.setUsername(userDO.getUsername());
|
||||
credentials.setExpiration(System.currentTimeMillis() + authConfig.getExpireHours() * 3600 * 1000);
|
||||
String token = AuthUtil.generateToken(authConfig.getSecret(), credentials);
|
||||
LoginResult loginResult = new LoginResult();
|
||||
List<String> permissions = new ArrayList<>();
|
||||
String roleIds = userDO.getRoleIds();
|
||||
if (StringUtils.isNotEmpty(roleIds)) {
|
||||
List<String> roleIdList = Arrays.stream(roleIds.split(",")).map(String::trim).filter(StringUtils::isNotEmpty).collect(Collectors.toList());
|
||||
roleIdList.forEach(roleId -> {
|
||||
Long rId = Long.valueOf(roleId);
|
||||
SysRoleDO roleDO = roleMapper.selectById(rId);
|
||||
String permissionIds = roleDO.getPermissionIds();
|
||||
if (StringUtils.isNotEmpty(permissionIds)) {
|
||||
List<Long> permIds = Arrays.stream(permissionIds.split(",")).map(String::trim).
|
||||
filter(StringUtils::isNotEmpty).map(Long::valueOf).collect(Collectors.toList());
|
||||
permIds.forEach(id -> {
|
||||
String permission = rolePermCache.getPermCache().get(id).getPermission();
|
||||
if (StringUtils.isNotEmpty(permission)) {
|
||||
permissions.add(permission);
|
||||
} else {
|
||||
log.error("角色:{},权限id: {},不存在", roleId, id);
|
||||
}
|
||||
});
|
||||
}
|
||||
});
|
||||
}
|
||||
loginResult.setToken(token);
|
||||
loginResult.setPermissions(permissions);
|
||||
return ResponseData.create().data(loginResult).success();
|
||||
}
|
||||
|
||||
}
|
||||
@@ -1,172 +0,0 @@
|
||||
package com.xuxd.kafka.console.service.impl;
|
||||
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.dto.AlterClientQuotaDTO;
|
||||
import com.xuxd.kafka.console.beans.vo.ClientQuotaEntityVO;
|
||||
import com.xuxd.kafka.console.service.ClientQuotaService;
|
||||
import kafka.console.ClientQuotaConsole;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.collections.CollectionUtils;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.apache.kafka.common.config.internals.QuotaConfigs;
|
||||
import org.apache.kafka.common.quota.ClientQuotaEntity;
|
||||
import org.springframework.stereotype.Service;
|
||||
import scala.Tuple2;
|
||||
|
||||
import java.util.*;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
/**
|
||||
* @author 晓东哥哥
|
||||
*/
|
||||
@Slf4j
|
||||
@Service
|
||||
public class ClientQuotaServiceImpl implements ClientQuotaService {
|
||||
|
||||
private final ClientQuotaConsole clientQuotaConsole;
|
||||
|
||||
private final Map<String, String> typeDict = new HashMap<>();
|
||||
|
||||
private final Map<String, String> configDict = new HashMap<>();
|
||||
|
||||
private final String USER = "user";
|
||||
private final String CLIENT_ID = "client-id";
|
||||
private final String IP = "ip";
|
||||
private final String USER_CLIENT = "user&client-id";
|
||||
|
||||
{
|
||||
typeDict.put(USER, ClientQuotaEntity.USER);
|
||||
typeDict.put(CLIENT_ID, ClientQuotaEntity.CLIENT_ID);
|
||||
typeDict.put(IP, ClientQuotaEntity.IP);
|
||||
typeDict.put(USER_CLIENT, USER_CLIENT);
|
||||
|
||||
configDict.put("producerRate", QuotaConfigs.PRODUCER_BYTE_RATE_OVERRIDE_CONFIG);
|
||||
configDict.put("consumerRate", QuotaConfigs.CONSUMER_BYTE_RATE_OVERRIDE_CONFIG);
|
||||
configDict.put("requestPercentage", QuotaConfigs.REQUEST_PERCENTAGE_OVERRIDE_CONFIG);
|
||||
}
|
||||
|
||||
public ClientQuotaServiceImpl(ClientQuotaConsole clientQuotaConsole) {
|
||||
this.clientQuotaConsole = clientQuotaConsole;
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData getClientQuotaConfigs(List<String> types, List<String> names) {
|
||||
List<String> entityNames = names == null ? Collections.emptyList() : new ArrayList<>(names);
|
||||
List<String> entityTypes = types.stream().map(e -> typeDict.get(e)).filter(e -> e != null).collect(Collectors.toList());
|
||||
if (entityTypes.isEmpty() || entityTypes.size() != types.size()) {
|
||||
throw new IllegalArgumentException("types illegal.");
|
||||
}
|
||||
|
||||
boolean userAndClientFilterClientOnly = false;
|
||||
// only type: [user and client-id], type.size == 2
|
||||
if (entityTypes.size() == 2) {
|
||||
if (names.size() == 2 && StringUtils.isBlank(names.get(0)) && StringUtils.isNotBlank(names.get(1))) {
|
||||
userAndClientFilterClientOnly = true;
|
||||
}
|
||||
}
|
||||
Map<ClientQuotaEntity, Map<String, Object>> clientQuotasConfigs = clientQuotaConsole.getClientQuotasConfigs(entityTypes,
|
||||
userAndClientFilterClientOnly ? Collections.emptyList() : entityNames);
|
||||
|
||||
List<ClientQuotaEntityVO> voList = clientQuotasConfigs.entrySet().stream().map(entry -> ClientQuotaEntityVO.from(
|
||||
entry.getKey(), entityTypes, entry.getValue())).collect(Collectors.toList());
|
||||
if (!userAndClientFilterClientOnly) {
|
||||
return ResponseData.create().data(voList).success();
|
||||
}
|
||||
List<ClientQuotaEntityVO> list = voList.stream().filter(e -> names.get(1).equals(e.getClient())).collect(Collectors.toList());
|
||||
|
||||
return ResponseData.create().data(list).success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData alterClientQuotaConfigs(AlterClientQuotaDTO request) {
|
||||
|
||||
if (StringUtils.isEmpty(request.getType()) || !typeDict.containsKey(request.getType())) {
|
||||
return ResponseData.create().failed("Unknown type.");
|
||||
}
|
||||
|
||||
List<String> types = new ArrayList<>();
|
||||
List<String> names = new ArrayList<>();
|
||||
parseTypesAndNames(request, types, names, request.getType());
|
||||
Map<String, String> configsToBeAddedMap = new HashMap<>();
|
||||
|
||||
if (StringUtils.isNotEmpty(request.getProducerRate())) {
|
||||
configsToBeAddedMap.put(QuotaConfigs.PRODUCER_BYTE_RATE_OVERRIDE_CONFIG, String.valueOf(Math.floor(Double.valueOf(request.getProducerRate()))));
|
||||
}
|
||||
if (StringUtils.isNotEmpty(request.getConsumerRate())) {
|
||||
configsToBeAddedMap.put(QuotaConfigs.CONSUMER_BYTE_RATE_OVERRIDE_CONFIG, String.valueOf(Math.floor(Double.valueOf(request.getConsumerRate()))));
|
||||
}
|
||||
if (StringUtils.isNotEmpty(request.getRequestPercentage())) {
|
||||
configsToBeAddedMap.put(QuotaConfigs.REQUEST_PERCENTAGE_OVERRIDE_CONFIG, String.valueOf(Math.floor(Double.valueOf(request.getRequestPercentage()))));
|
||||
}
|
||||
|
||||
Tuple2<Object, String> tuple2 = clientQuotaConsole.addQuotaConfigs(types, names, configsToBeAddedMap);
|
||||
if (!(Boolean) tuple2._1) {
|
||||
return ResponseData.create().failed(tuple2._2);
|
||||
}
|
||||
if (CollectionUtils.isNotEmpty(request.getDeleteConfigs())) {
|
||||
List<String> delete = request.getDeleteConfigs().stream().map(key -> configDict.get(key)).collect(Collectors.toList());
|
||||
Tuple2<Object, String> tuple2Del = clientQuotaConsole.deleteQuotaConfigs(types, names, delete);
|
||||
if (!(Boolean) tuple2Del._1) {
|
||||
return ResponseData.create().failed(tuple2Del._2);
|
||||
}
|
||||
}
|
||||
return ResponseData.create().success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData deleteClientQuotaConfigs(AlterClientQuotaDTO request) {
|
||||
if (StringUtils.isEmpty(request.getType()) || !typeDict.containsKey(request.getType())) {
|
||||
return ResponseData.create().failed("Unknown type.");
|
||||
}
|
||||
List<String> types = new ArrayList<>();
|
||||
List<String> names = new ArrayList<>();
|
||||
parseTypesAndNames(request, types, names, request.getType());
|
||||
List<String> configs = new ArrayList<>();
|
||||
configs.add(QuotaConfigs.PRODUCER_BYTE_RATE_OVERRIDE_CONFIG);
|
||||
configs.add(QuotaConfigs.CONSUMER_BYTE_RATE_OVERRIDE_CONFIG);
|
||||
configs.add(QuotaConfigs.REQUEST_PERCENTAGE_OVERRIDE_CONFIG);
|
||||
Tuple2<Object, String> tuple2 = clientQuotaConsole.deleteQuotaConfigs(types, names, configs);
|
||||
if (!(Boolean) tuple2._1) {
|
||||
return ResponseData.create().failed(tuple2._2);
|
||||
}
|
||||
return ResponseData.create().success();
|
||||
}
|
||||
|
||||
private void parseTypesAndNames(AlterClientQuotaDTO request, List<String> types, List<String> names, String type) {
|
||||
switch (request.getType()) {
|
||||
case USER:
|
||||
getTypesAndNames(request, types, names, USER);
|
||||
break;
|
||||
case CLIENT_ID:
|
||||
getTypesAndNames(request, types, names, CLIENT_ID);
|
||||
break;
|
||||
case IP:
|
||||
getTypesAndNames(request, types, names, IP);
|
||||
break;
|
||||
case USER_CLIENT:
|
||||
getTypesAndNames(request, types, names, USER);
|
||||
getTypesAndNames(request, types, names, CLIENT_ID);
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
private void getTypesAndNames(AlterClientQuotaDTO request, List<String> types, List<String> names, String type) {
|
||||
int index = -1;
|
||||
for (int i = 0; i < request.getTypes().size(); i++) {
|
||||
if (type.equals(request.getTypes().get(i))) {
|
||||
index = i;
|
||||
break;
|
||||
}
|
||||
}
|
||||
if (index < 0) {
|
||||
throw new IllegalArgumentException("Does not contain the type:" + type);
|
||||
}
|
||||
types.add(request.getTypes().get(index));
|
||||
if (CollectionUtils.isNotEmpty(request.getNames()) && request.getNames().size() > index) {
|
||||
names.add(request.getNames().get(index));
|
||||
} else {
|
||||
names.add("");
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
@@ -1,7 +1,6 @@
package com.xuxd.kafka.console.service.impl;

import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.xuxd.kafka.console.beans.BrokerNode;
import com.xuxd.kafka.console.beans.ClusterInfo;
import com.xuxd.kafka.console.beans.ResponseData;
import com.xuxd.kafka.console.beans.dos.ClusterInfoDO;
@@ -9,11 +8,15 @@ import com.xuxd.kafka.console.beans.vo.BrokerApiVersionVO;
import com.xuxd.kafka.console.beans.vo.ClusterInfoVO;
import com.xuxd.kafka.console.dao.ClusterInfoMapper;
import com.xuxd.kafka.console.service.ClusterService;

import java.util.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.TreeSet;
import java.util.stream.Collectors;
import kafka.console.ClusterConsole;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.collections.CollectionUtils;
import org.apache.commons.lang3.StringUtils;
import org.apache.kafka.clients.NodeApiVersions;
@@ -27,7 +30,6 @@ import org.springframework.stereotype.Service;
 * @author xuxd
 * @date 2021-10-08 14:23:09
 **/
@Slf4j
@Service
public class ClusterServiceImpl implements ClusterService {

@@ -43,12 +45,7 @@ public class ClusterServiceImpl implements ClusterService {

    @Override public ResponseData getClusterInfo() {
        ClusterInfo clusterInfo = clusterConsole.clusterInfo();
        Set<BrokerNode> nodes = clusterInfo.getNodes();
        if (nodes == null) {
            log.error("集群节点信息为空,集群地址可能不正确或集群内没有活跃节点");
            return ResponseData.create().failed("集群节点信息为空,集群地址可能不正确或集群内没有活跃节点");
        }
        clusterInfo.setNodes(new TreeSet<>(nodes));
        clusterInfo.setNodes(new TreeSet<>(clusterInfo.getNodes()));
        return ResponseData.create().data(clusterInfo).success();
    }

@@ -73,10 +70,6 @@ public class ClusterServiceImpl implements ClusterService {
    }

    @Override public ResponseData updateClusterInfo(ClusterInfoDO infoDO) {
        if (infoDO.getProperties() == null) {
            // If it is null the field is not updated (this was the bug); set it to an empty string as a workaround
            infoDO.setProperties("");
        }
        clusterInfoMapper.updateById(infoDO);
        return ResponseData.create().success();
    }

@@ -0,0 +1,99 @@
package com.xuxd.kafka.console.service.impl;

import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.core.conditions.update.UpdateWrapper;
import com.xuxd.kafka.console.beans.KafkaConsoleException;
import com.xuxd.kafka.console.beans.ResponseData;
import com.xuxd.kafka.console.beans.dos.DevOpsUserDO;
import com.xuxd.kafka.console.beans.dto.user.AddUserDTO;
import com.xuxd.kafka.console.beans.dto.user.ListUserDTO;
import com.xuxd.kafka.console.beans.dto.user.UpdateUserDTO;
import com.xuxd.kafka.console.beans.vo.DevOpsUserVO;
import com.xuxd.kafka.console.beans.vo.LoginVO;
import com.xuxd.kafka.console.boot.InitSuperDevOpsUser;
import com.xuxd.kafka.console.dao.DevOpsUserMapper;
import com.xuxd.kafka.console.service.DevOpsUserService;
import com.xuxd.kafka.console.utils.ConvertUtil;
import com.xuxd.kafka.console.utils.JwtUtils;
import com.xuxd.kafka.console.utils.Md5Utils;
import com.xuxd.kafka.console.utils.ResponseUtil;
import lombok.RequiredArgsConstructor;
import org.apache.commons.lang3.StringUtils;
import org.springframework.stereotype.Service;

import java.util.List;

@Service
@RequiredArgsConstructor
public class DevOpsServiceImpl implements DevOpsUserService {

    private final DevOpsUserMapper devOpsUserMapper;

    @Override
    public ResponseData<Boolean> add(AddUserDTO addUserDTO) {
        QueryWrapper<DevOpsUserDO> queryWrapper = new QueryWrapper<DevOpsUserDO>();
        queryWrapper.eq("username", addUserDTO.getUsername());
        if (devOpsUserMapper.selectOne(queryWrapper) != null) {
            throw new KafkaConsoleException("账号已存在");
        }

        addUserDTO.setPassword(Md5Utils.MD5(addUserDTO.getPassword()));
        int ret = devOpsUserMapper.insert(ConvertUtil.copy(addUserDTO, DevOpsUserDO.class));
        return ResponseUtil.success(ret > 0);
    }

    @Override
    public ResponseData<Boolean> update(UpdateUserDTO updateUserDTO) {
        UpdateWrapper<DevOpsUserDO> updateWrapper = new UpdateWrapper<>();
        if (updateUserDTO.getRole() != null) {
            updateWrapper.set("role", updateUserDTO.getRole());
        }
        if (StringUtils.isNotBlank(updateUserDTO.getPassword())) {
            updateWrapper.set("password", Md5Utils.MD5(updateUserDTO.getPassword()));
        }
        updateWrapper.eq("username", updateUserDTO.getUsername());
        int ret = devOpsUserMapper.update(null, updateWrapper);
        return ResponseUtil.success(ret > 0);
    }

    @Override
    public ResponseData<Boolean> delete(Long id) {
        int ret = devOpsUserMapper.deleteById(id);
        return ResponseUtil.success(ret > 0);
    }

    @Override
    public ResponseData<List<DevOpsUserVO>> list(ListUserDTO listUserDTO) {
        QueryWrapper<DevOpsUserDO> queryWrapper = new QueryWrapper<DevOpsUserDO>();
        if (listUserDTO.getId() != null) {
            queryWrapper.eq("id", listUserDTO.getId());
        }
        if (StringUtils.isNotBlank(listUserDTO.getUsername())) {
            queryWrapper.eq("username", listUserDTO.getUsername());
        }
        queryWrapper.ne("username", InitSuperDevOpsUser.SUPER_USERNAME);
        List<DevOpsUserDO> userDOS = devOpsUserMapper.selectList(queryWrapper);
        return ResponseUtil.success(ConvertUtil.copyList(userDOS, DevOpsUserVO.class));
    }

    @Override
    public ResponseData<DevOpsUserVO> detail(String username) {
        QueryWrapper<DevOpsUserDO> queryWrapper = new QueryWrapper<DevOpsUserDO>();
        queryWrapper.eq("username", username);
        DevOpsUserDO userDO = devOpsUserMapper.selectOne(queryWrapper);
        return ResponseUtil.success(ConvertUtil.copy(userDO, DevOpsUserVO.class));
    }

    @Override
    public ResponseData<LoginVO> login(String username, String password) {
        QueryWrapper<DevOpsUserDO> queryWrapper = new QueryWrapper<DevOpsUserDO>();
        queryWrapper.eq("username", username);
        queryWrapper.eq("password", Md5Utils.MD5(password));
        DevOpsUserDO userDO = devOpsUserMapper.selectOne(queryWrapper);
        if (userDO == null) {
            throw new KafkaConsoleException("用户名或密码错误");
        }
        LoginVO loginVO = LoginVO.builder().role(userDO.getRole()).token(JwtUtils.sign(username)).build();
        return ResponseUtil.success(loginVO);
    }
}
@@ -24,7 +24,6 @@ import kafka.console.TopicConsole;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.collections.CollectionUtils;
import org.apache.commons.lang3.StringUtils;
import org.apache.kafka.clients.admin.RecordsToDelete;
import org.apache.kafka.clients.admin.TopicDescription;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.producer.ProducerRecord;
@@ -243,18 +242,6 @@ public class MessageServiceImpl implements MessageService, ApplicationContextAwa
        return success ? ResponseData.create().success("success: " + tuple2._2()) : ResponseData.create().failed(tuple2._2());
    }

    @Override
    public ResponseData delete(List<QueryMessage> messages) {
        Map<TopicPartition, RecordsToDelete> params = new HashMap<>(messages.size(), 1f);

        messages.forEach(message -> {
            params.put(new TopicPartition(message.getTopic(), message.getPartition()), RecordsToDelete.beforeOffset(message.getOffset()));
        });
        Tuple2<Object, String> tuple2 = messageConsole.delete(params);
        boolean success = (boolean) tuple2._1();
        return success ? ResponseData.create().success() : ResponseData.create().failed(tuple2._2());
    }

    private Map<TopicPartition, ConsumerRecord<byte[], byte[]>> searchRecordByOffset(QueryMessage queryMessage) {
        Set<TopicPartition> partitions = getPartitions(queryMessage);

@@ -9,6 +9,16 @@ import com.xuxd.kafka.console.beans.vo.TopicDescriptionVO;
import com.xuxd.kafka.console.beans.vo.TopicPartitionVO;
import com.xuxd.kafka.console.service.TopicService;
import com.xuxd.kafka.console.utils.GsonUtil;
import java.util.Calendar;
import java.util.Collections;
import java.util.Comparator;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.atomic.AtomicLong;
import java.util.stream.Collectors;
import kafka.console.MessageConsole;
import kafka.console.TopicConsole;
import lombok.extern.slf4j.Slf4j;
@@ -23,10 +33,6 @@ import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import scala.Tuple2;

import java.util.*;
import java.util.concurrent.atomic.AtomicLong;
import java.util.stream.Collectors;

/**
 * kafka-console-ui.
 *
@@ -81,8 +87,8 @@ public class TopicServiceImpl implements TopicService {
        return ResponseData.create().data(topicDescriptions.stream().map(d -> TopicDescriptionVO.from(d))).success();
    }

    @Override public ResponseData deleteTopics(Collection<String> topics) {
        Tuple2<Object, String> tuple2 = topicConsole.deleteTopics(topics);
    @Override public ResponseData deleteTopic(String topic) {
        Tuple2<Object, String> tuple2 = topicConsole.deleteTopic(topic);
        return (Boolean) tuple2._1 ? ResponseData.create().success() : ResponseData.create().failed(tuple2._2);
    }

@@ -1,230 +0,0 @@
|
||||
package com.xuxd.kafka.console.service.impl;
|
||||
|
||||
import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
|
||||
import com.xuxd.kafka.console.beans.ResponseData;
|
||||
import com.xuxd.kafka.console.beans.RolePermUpdateEvent;
|
||||
import com.xuxd.kafka.console.beans.dos.SysPermissionDO;
|
||||
import com.xuxd.kafka.console.beans.dos.SysRoleDO;
|
||||
import com.xuxd.kafka.console.beans.dos.SysUserDO;
|
||||
import com.xuxd.kafka.console.beans.dto.SysPermissionDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.SysRoleDTO;
|
||||
import com.xuxd.kafka.console.beans.dto.SysUserDTO;
|
||||
import com.xuxd.kafka.console.beans.vo.SysPermissionVO;
|
||||
import com.xuxd.kafka.console.beans.vo.SysRoleVO;
|
||||
import com.xuxd.kafka.console.beans.vo.SysUserVO;
|
||||
import com.xuxd.kafka.console.dao.SysPermissionMapper;
|
||||
import com.xuxd.kafka.console.dao.SysRoleMapper;
|
||||
import com.xuxd.kafka.console.dao.SysUserMapper;
|
||||
import com.xuxd.kafka.console.service.UserManageService;
|
||||
import com.xuxd.kafka.console.utils.RandomStringUtil;
|
||||
import com.xuxd.kafka.console.utils.UUIDStrUtil;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.apache.commons.lang3.StringUtils;
|
||||
import org.springframework.beans.factory.ObjectProvider;
|
||||
import org.springframework.context.ApplicationEventPublisher;
|
||||
import org.springframework.stereotype.Service;
|
||||
|
||||
import java.util.*;
|
||||
import java.util.function.Function;
|
||||
import java.util.stream.Collectors;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/4/11 21:24
|
||||
**/
|
||||
@Slf4j
|
||||
@Service
|
||||
public class UserManageServiceImpl implements UserManageService {
|
||||
|
||||
private final SysUserMapper userMapper;
|
||||
|
||||
private final SysRoleMapper roleMapper;
|
||||
|
||||
private final SysPermissionMapper permissionMapper;
|
||||
|
||||
private final ApplicationEventPublisher publisher;
|
||||
|
||||
public UserManageServiceImpl(ObjectProvider<SysUserMapper> userMapper,
|
||||
ObjectProvider<SysRoleMapper> roleMapper,
|
||||
ObjectProvider<SysPermissionMapper> permissionMapper,
|
||||
ApplicationEventPublisher publisher) {
|
||||
this.userMapper = userMapper.getIfAvailable();
|
||||
this.roleMapper = roleMapper.getIfAvailable();
|
||||
this.permissionMapper = permissionMapper.getIfAvailable();
|
||||
this.publisher = publisher;
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData addPermission(SysPermissionDTO permissionDTO) {
|
||||
permissionMapper.insert(permissionDTO.toSysPermissionDO());
|
||||
return ResponseData.create().success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData addOrUdpateRole(SysRoleDTO roleDTO) {
|
||||
SysRoleDO roleDO = roleDTO.toDO();
|
||||
if (roleDO.getId() == null) {
|
||||
roleMapper.insert(roleDO);
|
||||
} else {
|
||||
roleMapper.updateById(roleDO);
|
||||
}
|
||||
publisher.publishEvent(new RolePermUpdateEvent(this));
|
||||
return ResponseData.create().success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData addOrUpdateUser(SysUserDTO userDTO) {
|
||||
|
||||
if (userDTO.getId() == null) {
|
||||
if (StringUtils.isEmpty(userDTO.getPassword())) {
|
||||
userDTO.setPassword(RandomStringUtil.random6Str());
|
||||
}
|
||||
SysUserDO userDO = userDTO.toDO();
|
||||
QueryWrapper<SysUserDO> queryWrapper = new QueryWrapper<>();
|
||||
queryWrapper.eq(true, "username", userDO.getUsername());
|
||||
SysUserDO exist = userMapper.selectOne(queryWrapper);
|
||||
if (exist != null) {
|
||||
return ResponseData.create().failed("用户已存在:" + userDO.getUsername());
|
||||
}
|
||||
userDO.setSalt(UUIDStrUtil.random());
|
||||
userDO.setPassword(UUIDStrUtil.generate(userDTO.getPassword(), userDO.getSalt()));
|
||||
userMapper.insert(userDO);
|
||||
} else {
|
||||
SysUserDO userDO = userMapper.selectById(userDTO.getId());
|
||||
if (userDO == null) {
|
||||
log.error("查不到用户: {}", userDTO.getId());
|
||||
return ResponseData.create().failed("Unknown User.");
|
||||
}
|
||||
// Decide whether the password needs to be reset
|
||||
if (userDTO.getResetPassword()) {
|
||||
userDTO.setPassword(RandomStringUtil.random6Str());
|
||||
userDO.setSalt(UUIDStrUtil.random());
|
||||
userDO.setPassword(UUIDStrUtil.generate(userDTO.getPassword(), userDO.getSalt()));
|
||||
}
|
||||
userDO.setRoleIds(userDTO.getRoleIds());
|
||||
userDO.setUsername(userDTO.getUsername());
|
||||
userMapper.updateById(userDO);
|
||||
}
|
||||
return ResponseData.create().data(userDTO.getPassword()).success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData selectRole() {
|
||||
List<SysRoleDO> dos = roleMapper.selectList(new QueryWrapper<>());
|
||||
return ResponseData.create().data(dos.stream().map(SysRoleVO::from).collect(Collectors.toList())).success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData selectPermission() {
|
||||
QueryWrapper<SysPermissionDO> queryWrapper = new QueryWrapper<>();
|
||||
|
||||
List<SysPermissionDO> permissionDOS = permissionMapper.selectList(queryWrapper);
|
||||
List<SysPermissionVO> vos = new ArrayList<>();
|
||||
Map<Long, Integer> posMap = new HashMap<>();
|
||||
Map<Long, SysPermissionVO> voMap = new HashMap<>();
|
||||
|
||||
Iterator<SysPermissionDO> iterator = permissionDOS.iterator();
|
||||
while (iterator.hasNext()) {
|
||||
SysPermissionDO permissionDO = iterator.next();
|
||||
if (permissionDO.getParentId() == null) {
|
||||
// Top-level menu entry
|
||||
SysPermissionVO vo = SysPermissionVO.from(permissionDO);
|
||||
vos.add(vo);
|
||||
int index = vos.size() - 1;
|
||||
// Remember its position in the result list
|
||||
posMap.put(permissionDO.getId(), index);
|
||||
iterator.remove();
|
||||
}
|
||||
}
|
||||
// All menu entries have been handled above
|
||||
while (!permissionDOS.isEmpty()) {
|
||||
iterator = permissionDOS.iterator();
|
||||
while (iterator.hasNext()) {
|
||||
SysPermissionDO permissionDO = iterator.next();
|
||||
Long parentId = permissionDO.getParentId();
|
||||
if (posMap.containsKey(parentId)) {
|
||||
// Button under a menu
|
||||
SysPermissionVO vo = SysPermissionVO.from(permissionDO);
|
||||
Integer index = posMap.get(parentId);
|
||||
SysPermissionVO menuVO = vos.get(index);
|
||||
if (menuVO.getChildren() == null) {
|
||||
menuVO.setChildren(new ArrayList<>());
|
||||
}
|
||||
menuVO.getChildren().add(vo);
|
||||
voMap.put(permissionDO.getId(), vo);
|
||||
iterator.remove();
|
||||
} else if (voMap.containsKey(parentId)) {
|
||||
// Button nested under another button
|
||||
SysPermissionVO vo = SysPermissionVO.from(permissionDO);
|
||||
SysPermissionVO buttonVO = voMap.get(parentId);
|
||||
if (buttonVO.getChildren() == null) {
|
||||
buttonVO.setChildren(new ArrayList<>());
|
||||
}
|
||||
buttonVO.getChildren().add(vo);
|
||||
voMap.put(permissionDO.getId(), vo);
|
||||
iterator.remove();
|
||||
}
|
||||
}
|
||||
}
|
||||
return ResponseData.create().data(vos).success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData selectUser() {
|
||||
QueryWrapper<SysUserDO> queryWrapper = new QueryWrapper<>();
|
||||
List<SysUserDO> userDOS = userMapper.selectList(queryWrapper);
|
||||
List<SysRoleDO> roleDOS = roleMapper.selectList(null);
|
||||
Map<Long, SysRoleDO> roleDOMap = roleDOS.stream().collect(Collectors.toMap(SysRoleDO::getId, Function.identity(), (e1, e2) -> e1));
|
||||
List<SysUserVO> voList = userDOS.stream().map(SysUserVO::from).collect(Collectors.toList());
|
||||
voList.forEach(vo -> {
|
||||
if (vo.getRoleIds() != null) {
|
||||
Long roleId = Long.valueOf(vo.getRoleIds());
|
||||
vo.setRoleNames(roleDOMap.containsKey(roleId) ? roleDOMap.get(roleId).getRoleName() : null);
|
||||
}
|
||||
});
|
||||
return ResponseData.create().data(voList).success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData updateUser(SysUserDTO userDTO) {
|
||||
userMapper.updateById(userDTO.toDO());
|
||||
return ResponseData.create().success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData updateRole(SysRoleDTO roleDTO) {
|
||||
roleMapper.updateById(roleDTO.toDO());
|
||||
publisher.publishEvent(new RolePermUpdateEvent(this));
|
||||
return ResponseData.create().success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData deleteRole(Long id) {
|
||||
QueryWrapper<SysUserDO> queryWrapper = new QueryWrapper<>();
|
||||
queryWrapper.eq(true, "role_ids", id);
|
||||
Integer count = userMapper.selectCount(queryWrapper);
|
||||
if (count > 0) {
|
||||
return ResponseData.create().failed("存在用户被分配为当前角色,不允许删除");
|
||||
}
|
||||
roleMapper.deleteById(id);
|
||||
publisher.publishEvent(new RolePermUpdateEvent(this));
|
||||
return ResponseData.create().success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData deleteUser(Long id) {
|
||||
userMapper.deleteById(id);
|
||||
return ResponseData.create().success();
|
||||
}
|
||||
|
||||
@Override
|
||||
public ResponseData updatePassword(SysUserDTO userDTO) {
|
||||
SysUserDO userDO = userDTO.toDO();
|
||||
userDO.setSalt(UUIDStrUtil.random());
|
||||
userDO.setPassword(UUIDStrUtil.generate(userDTO.getPassword(), userDO.getSalt()));
|
||||
QueryWrapper<SysUserDO> wrapper = new QueryWrapper<>();
|
||||
wrapper.eq("username", userDTO.getUsername());
|
||||
userMapper.update(userDO, wrapper);
|
||||
return ResponseData.create().success();
|
||||
}
|
||||
}
|
||||
@@ -1,54 +0,0 @@
|
||||
package com.xuxd.kafka.console.utils;
|
||||
|
||||
import com.google.gson.Gson;
|
||||
import com.xuxd.kafka.console.beans.Credentials;
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
import org.springframework.util.Base64Utils;
|
||||
|
||||
import java.nio.charset.StandardCharsets;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/14 19:34
|
||||
**/
|
||||
@Slf4j
|
||||
public class AuthUtil {
|
||||
|
||||
private static Gson gson = GsonUtil.INSTANCE.get();
|
||||
|
||||
public static String generateToken(String secret, Credentials info) {
|
||||
String json = gson.toJson(info);
|
||||
String str = json + secret;
|
||||
String signature = MD5Util.md5(str);
|
||||
return Base64Utils.encodeToString(json.getBytes(StandardCharsets.UTF_8)) + "." +
|
||||
Base64Utils.encodeToString(signature.getBytes(StandardCharsets.UTF_8));
|
||||
}
|
||||
|
||||
public static boolean isToken(String token) {
|
||||
return token.split("\\.").length == 2;
|
||||
}
|
||||
|
||||
public static Credentials parseToken(String secret, String token) {
|
||||
if (!isToken(token)) {
|
||||
return Credentials.INVALID;
|
||||
}
|
||||
String[] arr = token.split("\\.");
|
||||
String infoStr = new String(Base64Utils.decodeFromString(arr[0]), StandardCharsets.UTF_8);
|
||||
String signature = new String(Base64Utils.decodeFromString(arr[1]), StandardCharsets.UTF_8);
|
||||
|
||||
String encrypt = MD5Util.md5(infoStr + secret);
|
||||
if (!encrypt.equals(signature)) {
|
||||
return Credentials.INVALID;
|
||||
}
|
||||
try {
|
||||
Credentials credentials = gson.fromJson(infoStr, Credentials.class);
|
||||
if (credentials.getExpiration() < System.currentTimeMillis()) {
|
||||
return Credentials.INVALID;
|
||||
}
|
||||
return credentials;
|
||||
} catch (Exception e) {
|
||||
log.error("解析token失败: {}", token, e);
|
||||
return Credentials.INVALID;
|
||||
}
|
||||
}
|
||||
}
|
||||
23 src/main/java/com/xuxd/kafka/console/utils/ContextUtil.java Normal file
@@ -0,0 +1,23 @@
package com.xuxd.kafka.console.utils;

import java.util.HashMap;
import java.util.Map;

public class ContextUtil {

    public static final String USERNAME = "username";

    private static ThreadLocal<Map<String, Object>> context = ThreadLocal.withInitial(() -> new HashMap<>());

    public static void set(String key, Object value) {
        context.get().put(key, value);
    }

    public static String get(String key) {
        return (String) context.get().get(key);
    }

    public static void clear() {
        context.remove();
    }
}
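For reference, a minimal sketch (my illustration, not part of the diff) of the intended set/get/clear lifecycle around a request handled on one thread; the literal "admin" is only a placeholder value:

// Illustrative only: store the resolved user for the current thread, read it later, always clear it.
try {
    ContextUtil.set(ContextUtil.USERNAME, "admin");      // e.g. after the JWT has been verified
    String currentUser = ContextUtil.get(ContextUtil.USERNAME);
    // ... handle the request on this thread ...
} finally {
    ContextUtil.clear();                                 // avoid leaking the value into pooled threads
}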
@@ -1,17 +1,15 @@
package com.xuxd.kafka.console.utils;

import com.google.common.base.Preconditions;
import lombok.extern.slf4j.Slf4j;
import org.springframework.cglib.beans.BeanCopier;
import org.springframework.objenesis.ObjenesisStd;
import org.springframework.util.ClassUtils;

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import lombok.extern.slf4j.Slf4j;
import org.springframework.util.ClassUtils;
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;

/**
 * kafka-console-ui.
@@ -22,6 +20,47 @@ import org.springframework.util.ClassUtils;
@Slf4j
public class ConvertUtil {

    private static ThreadLocal<ObjenesisStd> objenesisStdThreadLocal = ThreadLocal.withInitial(ObjenesisStd::new);
    private static ConcurrentHashMap<Class<?>, ConcurrentHashMap<Class<?>, BeanCopier>> cache = new ConcurrentHashMap<>();

    public static <T> T copy(Object source, Class<T> target) {
        return copy(source, objenesisStdThreadLocal.get().newInstance(target));
    }

    public static <T> T copy(Object source, T target) {
        if (null == source) {
            return null;
        }
        BeanCopier beanCopier = getCacheBeanCopier(source.getClass(), target.getClass());
        beanCopier.copy(source, target, null);
        return target;
    }

    public static <T> List<T> copyList(List<?> sources, Class<T> target) {
        if (sources.isEmpty()) {
            return Collections.emptyList();
        }

        ArrayList<T> list = new ArrayList<>(sources.size());
        ObjenesisStd objenesisStd = objenesisStdThreadLocal.get();
        for (Object source : sources) {
            if (source == null) {
                break;
            }
            T newInstance = objenesisStd.newInstance(target);
            BeanCopier beanCopier = getCacheBeanCopier(source.getClass(), target);
            beanCopier.copy(source, newInstance, null);
            list.add(newInstance);
        }
        return list;
    }

    private static <S, T> BeanCopier getCacheBeanCopier(Class<S> source, Class<T> target) {
        ConcurrentHashMap<Class<?>, BeanCopier> copierConcurrentHashMap =
            cache.computeIfAbsent(source, aClass -> new ConcurrentHashMap<>(16));
        return copierConcurrentHashMap.computeIfAbsent(target, aClass -> BeanCopier.create(source, target, false));
    }

    public static Map<String, Object> toMap(Object src) {
        Preconditions.checkNotNull(src);
        Map<String, Object> res = new HashMap<>();
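A short usage sketch of the new copy helpers (illustrative; the DO/VO names are borrowed from elsewhere in this diff). cglib's BeanCopier only copies properties whose names and types match, and the copier instances are cached per source/target pair so the generated bytecode is built once:

// Illustrative only: map a persisted DO to its VO, and a list of DOs to VOs.
DevOpsUserDO userDO = devOpsUserMapper.selectOne(queryWrapper);
DevOpsUserVO vo = ConvertUtil.copy(userDO, DevOpsUserVO.class);

List<DevOpsUserDO> userDOS = devOpsUserMapper.selectList(queryWrapper);
List<DevOpsUserVO> vos = ConvertUtil.copyList(userDOS, DevOpsUserVO.class);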
43 src/main/java/com/xuxd/kafka/console/utils/JwtUtils.java Normal file
@@ -0,0 +1,43 @@
package com.xuxd.kafka.console.utils;

import io.jsonwebtoken.Claims;
import io.jsonwebtoken.Jwts;
import io.jsonwebtoken.SignatureAlgorithm;

import java.util.Date;
import java.util.HashMap;
import java.util.Map;

public class JwtUtils {

    private static final String ISSUER = "kafka-console-ui";
    private static final long EXPIRE_TIME = 5 * 24 * 60 * 60 * 1000;
    private static final String PRIVATE_KEY = "~hello!kafka=console^ui";

    public static String sign(String username) {
        Map<String, Object> header = new HashMap<>();
        header.put("typ", "JWT");
        header.put("alg", "HS256");
        Map<String, Object> claims = new HashMap<>();
        claims.put("username", username);
        return Jwts.builder()
            .setIssuer(ISSUER)
            .setHeader(header)
            .setClaims(claims)
            .setIssuedAt(new Date())
            .setExpiration(new Date(System.currentTimeMillis() + EXPIRE_TIME))
            .signWith(SignatureAlgorithm.HS256, PRIVATE_KEY)
            .compact();
    }

    public static String parse(String token) {
        try {
            Claims claims = Jwts.parser()
                .setSigningKey(PRIVATE_KEY)
                .parseClaimsJws(token).getBody();
            return (String) claims.get("username");
        } catch (Exception e) {
            return null;
        }
    }
}
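A small round-trip sketch of the token helper (illustrative, not part of the diff); parse returns null for a tampered or expired token, so callers treat null as unauthenticated:

String token = JwtUtils.sign("admin");         // HS256-signed token, valid for 5 days
String username = JwtUtils.parse(token);       // -> "admin"
String invalid = JwtUtils.parse(token + "x");  // corrupted signature -> null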
@@ -1,32 +0,0 @@
|
||||
package com.xuxd.kafka.console.utils;
|
||||
|
||||
import lombok.extern.slf4j.Slf4j;
|
||||
|
||||
import java.nio.charset.StandardCharsets;
|
||||
import java.security.MessageDigest;
|
||||
import java.security.NoSuchAlgorithmException;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/14 20:25
|
||||
**/
|
||||
@Slf4j
|
||||
public class MD5Util {
|
||||
|
||||
public static MessageDigest getInstance() {
|
||||
try {
|
||||
MessageDigest md5 = MessageDigest.getInstance("MD5");
|
||||
return md5;
|
||||
} catch (NoSuchAlgorithmException e) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
public static String md5(String str) {
|
||||
MessageDigest digest = getInstance();
|
||||
if (digest == null) {
|
||||
return null;
|
||||
}
|
||||
return new String(digest.digest(str.getBytes(StandardCharsets.UTF_8)), StandardCharsets.UTF_8);
|
||||
}
|
||||
}
|
||||
12 src/main/java/com/xuxd/kafka/console/utils/Md5Utils.java Normal file
@@ -0,0 +1,12 @@
package com.xuxd.kafka.console.utils;

import org.springframework.util.DigestUtils;

import java.nio.charset.StandardCharsets;

public class Md5Utils {

    public static String MD5(String s) {
        return DigestUtils.md5DigestAsHex(s.getBytes(StandardCharsets.UTF_8));
    }
}
@@ -1,26 +0,0 @@
|
||||
package com.xuxd.kafka.console.utils;
|
||||
|
||||
import java.util.Random;
|
||||
|
||||
/**
|
||||
* @author: xuxd
|
||||
* @date: 2023/5/8 9:19
|
||||
**/
|
||||
public class RandomStringUtil {
|
||||
|
||||
private final static String ALLOWED_CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
|
||||
|
||||
public static String random6Str() {
|
||||
return generateRandomString(6);
|
||||
}
|
||||
|
||||
public static String generateRandomString(int length) {
|
||||
Random random = new Random();
|
||||
StringBuilder sb = new StringBuilder(length);
|
||||
for (int i = 0; i < length; i++) {
|
||||
int index = random.nextInt(ALLOWED_CHARS.length());
|
||||
sb.append(ALLOWED_CHARS.charAt(index));
|
||||
}
|
||||
return sb.toString();
|
||||
}
|
||||
}
|
||||
15 src/main/java/com/xuxd/kafka/console/utils/ResponseUtil.java Normal file
@@ -0,0 +1,15 @@
package com.xuxd.kafka.console.utils;

import com.xuxd.kafka.console.beans.ResponseData;

public class ResponseUtil {

    public static <T> ResponseData<T> success(T data) {
        return ResponseData.create().data(data);
    }

    public static ResponseData<String> error(String msg) {
        return ResponseData.create().failed(msg);
    }

}
Some files were not shown because too many files have changed in this diff.