Linkis Task Submission and Execution REST API Documentation

    • All Linkis RESTful interfaces return responses in the following standard format:
    • method: the URI of the requested RESTful API, mainly used in WebSocket mode.
    • status: the return status, where -1 means not logged in, 0 means success, 1 means error, 2 means validation failed, and 3 means no access to the interface.
    • data: the specific data returned.
    • message: the prompt message for the request. If status is not 0, message contains an error message, and data may contain a stack field with the specific stack trace.

    For more information about the Linkis Restful interface specification, please refer to: Linkis Restful Interface Specification

    1. Submit Task

    • Interface /api/rest_j/v1/entrance/submit

    • Submission method POST

    • Request Parameters

    {
        "executionContent": {
            "code": "show tables",
            "runType": "sql"
        },
        "params": {
            "variable": { // task variable
                "testvar": "hello"
            },
            "configuration": {
                "runtime": { // task runtime params
                    "jdbc.url": "XX"
                },
                "startup": { // ec startup params
                    "spark.executor.cores": "4"
                }
            }
        },
        "source": { // task source information
            "scriptPath": "file:///tmp/hadoop/test.sql"
        },
        "labels": {
            "engineType": "spark-2.4.3",
            "userCreator": "hadoop-IDE"
        }
    }
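    The request body above can be assembled and posted with a few lines of Python. This is a minimal sketch, not the official client: the gateway address, and any authentication cookie or token you pass in headers, are assumptions that depend on your deployment.

    ```python
    import json
    import urllib.request

    def build_submit_payload(code, run_type="sql", engine_type="spark-2.4.3",
                             user_creator="hadoop-IDE"):
        """Assemble the request body described above."""
        return {
            "executionContent": {"code": code, "runType": run_type},
            "params": {"variable": {}, "configuration": {"runtime": {}, "startup": {}}},
            "labels": {"engineType": engine_type, "userCreator": user_creator},
        }

    def submit(gateway, payload, headers=None):
        """POST the payload to the entrance submit interface and parse the reply.
        gateway is e.g. "http://127.0.0.1:9001" (an assumed address)."""
        req = urllib.request.Request(
            gateway + "/api/rest_j/v1/entrance/submit",
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json", **(headers or {})},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read().decode("utf-8"))

    payload = build_submit_payload("show tables")
    ```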

    • Sample Response

    {
        "method": "/api/rest_j/v1/entrance/submit",
        "status": 0,
        "message": "Request executed successfully",
        "data": {
            "execID": "030418IDEhivebdpdwc010004:10087IDE_hadoop_21",
            "taskID": "123"
        }
    }
    • execID is the unique execution ID generated for a task after it is submitted to Linkis. It is of type String and is only meaningful while the task is running, similar to a PID. An execID is laid out as (requestApplicationName length)(executeApplicationName length)(instance length)${requestApplicationName}${executeApplicationName}${entranceInstance information ip+port}${requestApplicationName}_${umUser}_${index}

    • taskID is the unique ID of the task submitted by the user. It is generated by database auto-increment and is of type Long
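    The execID layout described above can be decoded mechanically. The sketch below assumes each of the three leading length fields is two digits, which matches the sample ID returned above:

    ```python
    def parse_exec_id(exec_id):
        """Split an execID into its parts, assuming three 2-digit length
        fields followed by the three variable-length segments and a
        trailing ${requestApplicationName}_${umUser}_${index} suffix."""
        lens = [int(exec_id[i:i + 2]) for i in (0, 2, 4)]
        pos, parts = 6, []
        for n in lens:
            parts.append(exec_id[pos:pos + n])
            pos += n
        request_app, execute_app, instance = parts
        return {
            "requestApplicationName": request_app,
            "executeApplicationName": execute_app,
            "instance": instance,
            "suffix": exec_id[pos:],  # e.g. "IDE_hadoop_21"
        }

    # The execID from the sample response above:
    info = parse_exec_id("030418IDEhivebdpdwc010004:10087IDE_hadoop_21")
    # → requestApplicationName "IDE", executeApplicationName "hive",
    #   instance "bdpdwc010004:10087", suffix "IDE_hadoop_21"
    ```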

    2. Get Status

    • Interface /api/rest_j/v1/entrance/${execID}/status

    • Submission method GET

    • Sample Response

    {
        "method": "/api/rest_j/v1/entrance/{execID}/status",
        "status": 0,
        "message": "Get status successful",
        "data": {
            "execID": "${execID}",
            "status": "Running"
        }
    }
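    A client typically polls this status interface until the task finishes. Below is a hedged sketch: the HTTP call is injected as a callable so the loop itself is testable, and the set of terminal state names is an assumption for illustration, not an exhaustive list from the source.

    ```python
    import time

    TERMINAL_STATES = {"Succeed", "Failed", "Cancelled", "Timeout"}  # assumed names

    def wait_for_completion(fetch_status, exec_id, interval=1.0, max_polls=60):
        """Poll the status interface until the task reaches a terminal state.
        fetch_status is any callable returning the parsed response of
        GET /api/rest_j/v1/entrance/${execID}/status."""
        for _ in range(max_polls):
            resp = fetch_status(exec_id)
            status = resp["data"]["status"]
            if status in TERMINAL_STATES:
                return status
            time.sleep(interval)
        raise TimeoutError("task %s did not finish after %d polls" % (exec_id, max_polls))

    # Stub standing in for the real HTTP call: Running twice, then Succeed.
    answers = iter(["Running", "Running", "Succeed"])
    status = wait_for_completion(
        lambda _id: {"status": 0, "data": {"execID": _id, "status": next(answers)}},
        "demo-exec-id", interval=0.0)
    # → "Succeed"
    ```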

    3. Get Logs

    • Interface /api/rest_j/v1/entrance/${execID}/log?fromLine=${fromLine}&size=${size}

    • The request parameter fromLine is the line number from which to start reading, and size is the number of log lines to return in this request
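    The fromLine handshake described above can be sketched as a loop that feeds each response's fromLine back into the next request. The HTTP call is stubbed out as a callable, and the response keys (log, fromLine) are assumptions based on the parameter description here, not a verbatim response schema:

    ```python
    def fetch_all_logs(fetch_page, exec_id, size=100):
        """Repeatedly call the log interface, passing the returned fromLine
        as the next request's fromLine, until no new lines arrive.
        fetch_page(exec_id, from_line, size) stands in for the HTTP GET."""
        from_line, pages = 1, []
        while True:
            data = fetch_page(exec_id, from_line, size)
            log, next_from = data["log"], data["fromLine"]
            if not log or next_from == from_line:
                break
            pages.append(log)
            from_line = next_from
        return pages

    # Stub producing 4 fake log lines in pages of 2, then an empty page.
    def fake_page(exec_id, from_line, size):
        lines = ["line %d" % n for n in range(from_line, min(from_line + size, 5))]
        return {"execID": exec_id, "fromLine": from_line + len(lines), "log": lines}

    pages = fetch_all_logs(fake_page, "demo", size=2)
    # → [["line 1", "line 2"], ["line 3", "line 4"]]
    ```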

    • Submission method GET

    • Sample Response, where the returned fromLine must be passed as the fromLine parameter of the next request to this interface

    4. Get Progress and Resource Information

    • Interface /api/rest_j/v1/entrance/${execID}/progressWithResource

    • Submission method GET

    • Sample Response

    {
        "method": "/api/entrance/exec_id018017linkis-cg-entrance127.0.0.1:9205IDE_hadoop_spark_2/progressWithResource",
        "status": 0,
        "message": "OK",
        "data": {
            "yarnMetrics": {
                "yarnResource": [
                    {
                        "queueMemory": 9663676416,
                        "queueInstances": 0,
                        "jobStatus": "COMPLETED",
                        "applicationId": "application_1655364300926_69504",
                        "queue": "default"
                    }
                ],
                "memoryPercent": 0.009,
                "memoryRGB": "green",
                "coreRGB": "green",
                "corePercent": 0.02
            },
            "progress": 0.5,
            "progressInfo": [
                {
                    "succeedTasks": 4,
                    "failedTasks": 0,
                    "id": "jobId-1(linkis-spark-mix-code-1946915)",
                    "totalTasks": 6,
                    "runningTasks": 0
                }
            ],
            "execID": "exec_id018017linkis-cg-entrance127.0.0.1:9205IDE_hadoop_spark_2"
        }
    }
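    A client often wants a flat summary of this nested response. The helper below, a sketch based only on the fields visible in the sample above, sums the per-job counters in progressInfo and pairs them with the overall progress and the yarn memory color:

    ```python
    def summarize_progress(data):
        """Aggregate the per-job counters in progressInfo and combine them
        with the overall progress and yarnMetrics color from the response."""
        totals = {"succeedTasks": 0, "failedTasks": 0, "runningTasks": 0, "totalTasks": 0}
        for job in data.get("progressInfo", []):
            for key in totals:
                totals[key] += job.get(key, 0)
        return {
            "progress": data["progress"],
            "memoryRGB": data["yarnMetrics"]["memoryRGB"],
            **totals,
        }

    # The data object from the sample response above, abbreviated:
    sample = {
        "yarnMetrics": {"memoryRGB": "green"},
        "progress": 0.5,
        "progressInfo": [{"succeedTasks": 4, "failedTasks": 0,
                          "id": "jobId-1", "totalTasks": 6, "runningTasks": 0}],
    }
    summary = summarize_progress(sample)
    # → progress 0.5, memoryRGB "green", 4 of 6 tasks succeeded
    ```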

    5. Kill Task

    • Interface /api/rest_j/v1/entrance/${execID}/kill

    • Submission method POST

    • Sample Response

    {
        "method": "/api/rest_j/v1/entrance/{execID}/kill",
        "status": 0,
        "message": "OK",
        "data": {
            "execID": "${execID}"
        }
    }

    6. Get task info

    • Interface /api/rest_j/v1/jobhistory/{id}/get

    • Submission method GET

    Request Parameters: the path parameter id is the task's taskID

    • Sample Response
    {
        "method": null,
        "status": 0,
        "message": "OK",
        "data": {
            "task": {
                "taskID": 1,
                "instance": "xxx",
                "execId": "exec-id-xxx",
                "umUser": "test",
                "engineInstance": "xxx",
                "progress": "10%",
                "logPath": "hdfs://xxx/xxx/xxx",
                "resultLocation": "hdfs://xxx/xxx/xxx",
                "status": "FAILED",
                "createdTime": "2019-01-01 00:00:00",
                "updatedTime": "2019-01-01 01:00:00",
                "engineType": "spark",
                "errorCode": 100,
                "errDesc": "Task Failed with error code 100",
                "executeApplicationName": "hello world",
                "runType": "xxx",
                "paramJson": "{\"xxx\":\"xxx\"}",
                "costTime": 10000,
                "sourceJson": "{\"xxx\":\"xxx\"}"
            }
        }
    }
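    A common use of this record is surfacing failures in a human-readable line. The helper below is a sketch using only field names taken from the sample above; the output format itself is illustrative:

    ```python
    def describe_task(task):
        """Summarize a jobhistory task record in one line, appending
        errDesc and errorCode when the task failed."""
        base = "task %s [%s] on %s, cost %d ms" % (
            task["taskID"], task["status"],
            task.get("engineType", "?"), task.get("costTime", 0))
        if task["status"] == "FAILED":
            base += " - %s (code %s)" % (
                task.get("errDesc", "unknown error"), task.get("errorCode"))
        return base

    # Fields from the sample response above:
    line = describe_task({"taskID": 1, "status": "FAILED", "engineType": "spark",
                          "costTime": 10000, "errorCode": 100,
                          "errDesc": "Task Failed with error code 100"})
    ```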
    7. Get Result Set List

    • Interface /api/rest_j/v1/filesystem/getDirFileTrees

    • Submission method GET

    Request Parameters:

    • Sample Response

    8. Get result content

    • Interface /api/rest_j/v1/filesystem/openFile

    • Submission method GET

    Request Parameters:

    • Sample Response
    {
        "method": "/api/filesystem/openFile",
        "status": 0,
        "message": "OK",
        "data": {
            "metadata": [
                {
                    "columnName": "count(1)",
                    "comment": "NULL",
                    "dataType": "long"
                }
            ],
            "totalPage": 0,
            "totalLine": 1,
            "page": 1,
            "type": "2",
            "fileContent": [
                [
                    "28"
                ]
            ]
        }
    }
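    The metadata and fileContent arrays in this response line up by position: each row in fileContent has one value per metadata column. A small sketch of turning them into per-row dicts:

    ```python
    def result_to_rows(data):
        """Zip the openFile metadata column names with each fileContent row,
        yielding one dict per result row."""
        columns = [col["columnName"] for col in data["metadata"]]
        return [dict(zip(columns, row)) for row in data["fileContent"]]

    # The data object from the sample response above:
    rows = result_to_rows({
        "metadata": [{"columnName": "count(1)", "comment": "NULL", "dataType": "long"}],
        "fileContent": [["28"]],
    })
    # → [{"count(1)": "28"}]
    ```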

    9. Get Result by stream

    Get the result as a CSV or Excel file

    • Interface /api/rest_j/v1/filesystem/resultsetToExcel

    • Submission method GET

    Request Parameters:

    • Response
    binary stream
    10. Submit Task (Old Interface)

    • Interface /api/rest_j/v1/entrance/execute

    • Submission method POST

    • Request Parameters

    {
        "executeApplicationName": "hive", // engine type
        "requestApplicationName": "dss", // client service type
        "executionCode": "show tables",
        "params": {
            "variable": { // task variable
                "testvar": "hello"
            },
            "configuration": {
                "runtime": { // task runtime params
                    "jdbc.url": "XX"
                },
                "startup": { // ec startup params
                    "spark.executor.cores": "4"
                }
            }
        },
        "source": { // task source information
            "scriptPath": "file:///tmp/hadoop/test.sql"
        },
        "labels": {
            "engineType": "spark-2.4.3",
            "userCreator": "hadoop-IDE"
        },
        "runType": "hql" // the type of script to run
    }
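    The field correspondences between the two request bodies shown in this document (executionContent.code maps to executionCode, executionContent.runType to runType) suggest a simple converter. This is a sketch under stated assumptions: deriving executeApplicationName from the engineType label prefix and defaulting requestApplicationName to "dss" are illustrative choices, not documented rules.

    ```python
    def to_old_execute_payload(submit_body, request_app="dss"):
        """Map the newer /entrance/submit body onto the older /entrance/execute
        body, using the correspondences visible in the two samples above."""
        return {
            # Assumption: engine type is the prefix of the engineType label.
            "executeApplicationName": submit_body["labels"]["engineType"].split("-")[0],
            "requestApplicationName": request_app,
            "executionCode": submit_body["executionContent"]["code"],
            "runType": submit_body["executionContent"]["runType"],
            "params": submit_body.get("params", {}),
            "source": submit_body.get("source", {}),
            "labels": submit_body.get("labels", {}),
        }

    old = to_old_execute_payload({
        "executionContent": {"code": "show tables", "runType": "sql"},
        "labels": {"engineType": "spark-2.4.3", "userCreator": "hadoop-IDE"},
    })
    # → executionCode "show tables", executeApplicationName "spark"
    ```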