2025-01-18 Update From: SLTechnology News&Howtos > Servers
Shulou (Shulou.com) 05/31 Report--
Many inexperienced users do not know how to access Linkis through HTTP, so this article summarizes the steps involved; I hope it helps you solve the problem.
This article will not repeat the introduction of DSS (DataSphereStudio) and Linkis; the focus below is on connecting third-party applications to DSS through HTTP (REST).
I. Environment description
Environment: dss-0.7.0, linkis-0.9.3
II. Log in (login)
For sending requests I tried okhttp, RestTemplate, and HttpClient; for converting between JSON and Java beans I tried fastjson, gson, and Jackson. Because DSS keeps the login state in a cookie, after many painful attempts I settled on RestTemplate (spring-web:5.0.7) + fastjson (1.2.58), passing the login cookie along to all subsequent requests.
POST /api/rest_j/v1/user/login

Request parameters:

{"userName": "hadoop", "password": "hadoop"}

Return example (the JSON may be slightly edited by me; the official documentation is authoritative):

{
    "method": null,
    "status": 0,
    "message": "login successful!",
    "data": {"isFirstLogin": false, "isAdmin": true, "userName": "hadoop"}
}

import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

/** Simple utility class; extend it according to your own needs. */
public class HttpUtil {
    public static RestTemplate getRestClient() {
        // The HttpClientBuilder below manages cookies automatically.
        // To disable cookie management, call HttpClientBuilder.create().disableCookieManagement().
        CloseableHttpClient httpClient = HttpClientBuilder.create().useSystemProperties().build();
        return new RestTemplate(new HttpComponentsClientHttpRequestFactory(httpClient));
    }
}
import com.alibaba.fastjson.JSONObject;
import org.springframework.http.ResponseEntity;

/**
 * Send a POST request to log in.
 * @param restClient
 * @return ResponseEntity
 */
private ResponseEntity<JSONObject> login(RestTemplate restClient) {
    JSONObject postData = new JSONObject();
    postData.put("userName", "hadoop");
    postData.put("password", "hadoop");
    String loginUrl = "http://ip:port/api/rest_j/v1/user/login";
    return restClient.postForEntity(loginUrl, postData, JSONObject.class);
}
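The RestTemplate above relies on HttpClient's automatic cookie management. If you disable it, you have to capture the Set-Cookie header from the login response and forward it yourself. A minimal sketch of turning a Set-Cookie header value into a Cookie request-header value with the JDK's java.net.HttpCookie; the cookie name "linkis_session" here is purely illustrative (inspect your real login response for the actual name):

```java
import java.net.HttpCookie;
import java.util.List;

public class CookieDemo {
    /** Parse a Set-Cookie header value into "name=value" pairs for a Cookie request header. */
    static String toCookieHeader(String setCookieValue) {
        List<HttpCookie> cookies = HttpCookie.parse(setCookieValue);
        StringBuilder sb = new StringBuilder();
        for (HttpCookie c : cookies) {
            if (sb.length() > 0) sb.append("; ");
            sb.append(c.getName()).append('=').append(c.getValue());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Illustrative header value; attributes such as Path and HttpOnly are stripped.
        System.out.println(toCookieHeader("linkis_session=abc123; Path=/; HttpOnly"));
    }
}
```

The resulting string can then be set on outgoing requests via an HttpHeaders "Cookie" entry when cookie management is disabled.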
III. Execute a task (execute)
The source code is in the EntranceRestfulApi class of the linkis module.
POST /api/rest_j/v1/entrance/execute

Request parameters:

{
    "method": "/api/rest_j/v1/entrance/execute",
    "params": {
        "variable": {"k1": "v1"},
        "configuration": {
            "special": {"k2": "v2"},
            "runtime": {"k3": "v3"},
            "startup": {"k4": "v4"}
        }
    },
    "executeApplicationName": "spark",
    "executionCode": "show tables",
    "runType": "sql",
    "source": {"scriptPath": "/home/Linkis/Linkis.sql"}
}

Return example:

{
    "method": "/api/rest_j/v1/entrance/execute",
    "status": 0,
    "message": "request executed successfully",
    "data": {
        "execID": "030418IDEhivebdpdwc010004:10087IDE_johnnwang_21", // execution id, used later to query the task status
        "taskID": "123" // task id, used later to fetch the result file
    }
}

/**
 * @param restClient
 * @param sql the sql code to be executed
 * @return
 */
private ResponseEntity<JSONObject> executeSql(RestTemplate restClient, String sql) {
    String url = "/api/rest_j/v1/entrance/execute";
    JSONObject map = new JSONObject();
    map.put("method", url);
    map.put("params", new HashMap<>()); // required; user-specified runtime parameters, the values may be empty
    map.put("executeApplicationName", "hive"); // execution engine; I use hive
    map.put("executionCode", sql);
    map.put("runType", "sql"); // for engines such as spark you can choose python, r, sql, etc.; must not be empty
    // I am not executing a script file, so there is no scriptPath parameter
    String executeSql = "http://ip:port" + url;
    return restClient.postForEntity(executeSql, map, JSONObject.class);
}
IV. Check the task status (status)
The source code is in the EntranceRestfulApi class of the linkis module.
GET /api/rest_j/v1/entrance/${execID}/status

Return example:

{
    "method": "/api/rest_j/v1/entrance/{execID}/status",
    "status": 0,
    "message": "get status successful",
    "data": {"execID": "${execID}", "status": "Running"}
}

String statusUrl = "http://ip:port/api/rest_j/v1/entrance/" + execID + "/status";
ResponseEntity<JSONObject> statusResp = restTemplate.getForEntity(statusUrl, JSONObject.class);
if (statusResp != null && statusResp.getStatusCode().value() == HttpStatus.SC_OK) {
    String status;
    for (;;) {
        statusResp = restTemplate.getForEntity(statusUrl, JSONObject.class);
        status = statusResp.getBody().getJSONObject("data").getString("status");
        // Poll the task status; exit the loop once the task succeeds or fails.
        if ("Succeed".equals(status) || "Failed".equals(status)) {
            break;
        }
        Thread.sleep(1000); // sleep between polls instead of busy-waiting against the server
    }
    if ("Succeed".equals(status)) {
        // do something
    }
}
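The polling loop can also be factored into a small helper that is testable without a live server. A minimal sketch, where the Supplier stands in for the GET /status call and the names (StatusPoller, pollUntilDone) are my own, not part of the Linkis API:

```java
import java.util.function.Supplier;

public class StatusPoller {
    /**
     * Poll a status source until a terminal state ("Succeed" or "Failed") is
     * reached or maxAttempts is exhausted, sleeping between polls.
     * Returns the last status observed.
     */
    static String pollUntilDone(Supplier<String> statusSupplier, int maxAttempts, long sleepMillis) {
        String status = statusSupplier.get();
        for (int i = 1; i < maxAttempts && !isTerminal(status); i++) {
            try {
                Thread.sleep(sleepMillis); // avoid hammering the server
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break; // stop polling if interrupted
            }
            status = statusSupplier.get();
        }
        return status;
    }

    private static boolean isTerminal(String status) {
        return "Succeed".equals(status) || "Failed".equals(status);
    }
}
```

In practice the supplier would wrap the restTemplate.getForEntity call from the snippet above and extract data.status from the response body; a timeout (maxAttempts) guards against tasks that never reach a terminal state.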
V. Obtain the result file path (get)
The source code is in the QueryRestfulApi class of the linkis module.
GET /api/rest_j/v1/jobhistory/${taskId}/get

Return example:

{
    "method": "/api/jobhistory/{id}/get",
    "status": 0,
    "message": "OK",
    "data": {
        "task": {
            "taskID": 3111,
            "instance": "test-dn2:9108",
            "execId": "IDE_hadoop_46",
            "umUser": "hadoop",
            "engineInstance": "test-dn2:37301",
            "executionCode": "show databases", // the sql
            "progress": 1.0,
            "logPath": "file:///linkis/hadoop/log/IDE/2020-09-08/3111.log", // log path
            "resultLocation": "hdfs:///linkis2/hadoop/dwc/20200908/IDE/3111", // result file path
            "status": "Succeed",
            "createdTime": 1599551337000,
            "updatedTime": 1599551339000,
            "engineType": null,
            "errCode": null,
            "errDesc": null,
            "executeApplicationName": "hive",
            "requestApplicationName": "IDE",
            "scriptPath": null,
            "runType": "sql",
            "paramsJson": "{}",
            "costTime": 2000,
            "strongerExecId": "030413IDEhivetest-dn2:9108IDE_hadoop_46",
            "sourceJson": "{\"scriptPath\": null}"
        }
    }
}

String historyUrl = "http://ip:port/api/rest_j/v1/jobhistory/" + taskID + "/get";
ResponseEntity<JSONObject> hisResp = restTemplate.getForEntity(historyUrl, JSONObject.class);
if (hisResp != null && hisResp.getStatusCode().value() == HttpStatus.SC_OK) {
    String resultLocation = hisResp.getBody().getJSONObject("data").getJSONObject("task").getString("resultLocation");
}
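With resultLocation in hand, the URL for opening the result file can be assembled. A minimal sketch, assuming the first result file is named _0.dolphin (as in the openFile example below) and that "http://ip:port" stands for your gateway address; the helper name is my own:

```java
public class ResultUrlBuilder {
    /**
     * Build the openFile URL from a resultLocation, tolerating a trailing
     * slash on the location. "_0.dolphin" is the first result file.
     */
    static String buildOpenFileUrl(String gateway, String resultLocation) {
        String base = resultLocation.endsWith("/")
                ? resultLocation.substring(0, resultLocation.length() - 1)
                : resultLocation;
        return gateway + "/api/rest_j/v1/filesystem/openFile?path=" + base + "/_0.dolphin";
    }
}
```

Note that a real path may also need URL-encoding before being placed in the query string.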
VI. Open the result file (openFile)
The source code is in the FsRestfulApi class of the linkis module.
GET /api/rest_j/v1/filesystem/openFile?path=${resultLocation}/_0.dolphin

Return example:

{"method": "/api/filesystem/openFile", "status": 0, "message": "OK", "data": { ... the sql query result data ... }}

String resUrl = "http://ip:port/api/rest_j/v1/filesystem/openFile?path=" + resultLocation + "/_0.dolphin";
ResponseEntity<JSONObject> resResp = restTemplate.getForEntity(resUrl, JSONObject.class);
if (resResp != null && resResp.getStatusCode().value() == HttpStatus.SC_OK) {
    // do something
}

After reading the above, have you mastered how third-party applications can access Linkis through HTTP? If you want to learn more, you are welcome to follow the industry information channel. Thank you for reading!