Linkis Task Submission and Execution REST API Document
- Responses from the Linkis RESTful interfaces follow this standard format:
- method: the URI of the requested RESTful API, mainly used in WebSocket mode.
- status: the return status, where -1 means not logged in, 0 means success, 1 means error, 2 means validation failed, and 3 means no access to the interface.
- data: the specific data returned.
- message: the prompt message for the request. If status is not 0, message contains an error message, and data may contain a stack field with the specific stack trace.
For more information about the Linkis Restful interface specification, please refer to: Linkis Restful Interface Specification
1. Submit Task
Interface
/api/rest_j/v1/entrance/submit
Submission method
POST
Request Parameters
{
    "executionContent": {
        "code": "show tables",
        "runType": "sql"
    },
    "params": {
        "variable": { // task variables
            "testvar": "hello"
        },
        "configuration": {
            "runtime": { // task runtime params
                "jdbc.url": "XX"
            },
            "startup": { // ec startup params
                "spark.executor.cores": "4"
            }
        }
    },
    "source": { // task source information
        "scriptPath": "file:///tmp/hadoop/test.sql"
    },
    "labels": {
        "engineType": "spark-2.4.3",
        "userCreator": "hadoop-IDE"
    }
}
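The request body above can be assembled programmatically. The following is a minimal sketch of a helper that builds the submit payload; the function name and defaults are illustrative, not part of the Linkis API, and authentication plus the gateway address are left to the caller.

```python
import json

def build_submit_payload(code, run_type="sql", engine_type="spark-2.4.3",
                         user_creator="hadoop-IDE", script_path=None,
                         variables=None, runtime=None, startup=None):
    """Hypothetical helper: builds the JSON body for /api/rest_j/v1/entrance/submit,
    using the field names from the request example above."""
    payload = {
        "executionContent": {"code": code, "runType": run_type},
        "params": {
            "variable": variables or {},
            "configuration": {
                "runtime": runtime or {},
                "startup": startup or {},
            },
        },
        "labels": {
            "engineType": engine_type,   # engine and version, e.g. spark-2.4.3
            "userCreator": user_creator, # submitting user and client, e.g. hadoop-IDE
        },
    }
    if script_path:
        # source is optional metadata about where the script came from
        payload["source"] = {"scriptPath": script_path}
    return payload

# Serialize for a POST to ${gateway}/api/rest_j/v1/entrance/submit
body = json.dumps(build_submit_payload("show tables", variables={"testvar": "hello"}))
```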
- Sample Response
{
    "method": "/api/rest_j/v1/entrance/submit",
    "status": 0,
    "message": "Request executed successfully",
    "data": {
        "execID": "030418IDEhivebdpdwc010004:10087IDE_hadoop_21",
        "taskID": "123"
    }
}
execID is the unique execution ID generated for a task once it is submitted to Linkis. It is of String type and is only meaningful while the task is running, similar to the concept of a PID. The execID is composed as:
(requestApplicationName length)(executeApplicationName length)(instance length)${requestApplicationName}${executeApplicationName}${entranceInstance ip+port}${requestApplicationName}_${umUser}_${index}
taskID is the unique ID of the submitted task. It is generated by database auto-increment and is of Long type.
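The layout above can be decoded mechanically. Below is a sketch of a parser that assumes the three length prefixes are two-digit, zero-padded decimal numbers, which matches the sample execID returned above; verify this assumption against your Linkis version before relying on it.

```python
def parse_exec_id(exec_id):
    """Split an execID into its components, assuming the layout described
    above with two-digit, zero-padded length prefixes (as in the sample)."""
    # First six characters: lengths of the three variable-width fields.
    len_req = int(exec_id[0:2])
    len_exec = int(exec_id[2:4])
    len_inst = int(exec_id[4:6])
    pos = 6
    request_app = exec_id[pos:pos + len_req]; pos += len_req
    execute_app = exec_id[pos:pos + len_exec]; pos += len_exec
    instance = exec_id[pos:pos + len_inst]; pos += len_inst
    # Remainder: ${requestApplicationName}_${umUser}_${index}
    return {
        "requestApplicationName": request_app,
        "executeApplicationName": execute_app,
        "instance": instance,
        "suffix": exec_id[pos:],
    }

parts = parse_exec_id("030418IDEhivebdpdwc010004:10087IDE_hadoop_21")
```

Applied to the sample execID, this yields requestApplicationName "IDE", executeApplicationName "hive", the entrance instance "bdpdwc010004:10087", and the suffix "IDE_hadoop_21".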
2. Get Status
Interface
/api/rest_j/v1/entrance/${execID}/status
Submission method
GET
Sample Response
{
    "method": "/api/rest_j/v1/entrance/{execID}/status",
    "status": 0,
    "message": "Get status successful",
    "data": {
        "execID": "${execID}",
        "status": "Running"
    }
}
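A client typically polls this interface until the task reaches a terminal state. The sketch below shows the stop condition only; the terminal state names are assumptions based on commonly seen Linkis job statuses, so check them against your Linkis version.

```python
# Assumed terminal states of a Linkis task; "Running" (as in the sample
# response above) is a non-terminal state and means polling should continue.
TERMINAL_STATES = {"Succeed", "Failed", "Cancelled", "Timeout"}

def is_finished(status):
    """Return True when polling the status interface can stop."""
    return status in TERMINAL_STATES

# A poller would GET /api/rest_j/v1/entrance/${execID}/status, read
# data["status"], sleep, and repeat until is_finished(...) returns True.
```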
3. Get Logs
Interface
/api/rest_j/v1/entrance/${execID}/log?fromLine=${fromLine}&size=${size}
Submission method
GET
The request parameter fromLine specifies the line to start reading from, and size specifies the maximum number of log lines to return. The fromLine value returned in the response should be passed as the fromLine parameter of the next request to this interface.
4. Get Progress and Resource
Interface
/api/rest_j/v1/entrance/${execID}/progressWithResource
Submission method
GET
Sample Response
{
    "method": "/api/entrance/exec_id018017linkis-cg-entrance127.0.0.1:9205IDE_hadoop_spark_2/progressWithResource",
    "status": 0,
    "message": "OK",
    "data": {
        "yarnMetrics": {
            "yarnResource": [
                {
                    "queueMemory": 9663676416,
                    "queueInstances": 0,
                    "jobStatus": "COMPLETED",
                    "applicationId": "application_1655364300926_69504",
                    "queue": "default"
                }
            ],
            "memoryPercent": 0.009,
            "memoryRGB": "green",
            "coreRGB": "green",
            "corePercent": 0.02
        },
        "progress": 0.5,
        "progressInfo": [
            {
                "succeedTasks": 4,
                "failedTasks": 0,
                "id": "jobId-1(linkis-spark-mix-code-1946915)",
                "totalTasks": 6,
                "runningTasks": 0
            }
        ],
        "execID": "exec_id018017linkis-cg-entrance127.0.0.1:9205IDE_hadoop_spark_2"
    }
}
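The progressInfo array reports per-job task counts. As an illustrative client-side calculation (the top-level progress field is the value Linkis itself reports and should be preferred), a sketch of deriving per-job completion ratios:

```python
def job_completion(progress_info):
    """Per-job completion ratios from a progressInfo array as in the sample
    response above: (succeedTasks + failedTasks) / totalTasks per job."""
    return {
        job["id"]: (job["succeedTasks"] + job["failedTasks"]) / job["totalTasks"]
        for job in progress_info
        if job["totalTasks"]  # skip jobs that have not reported tasks yet
    }

sample = [{"succeedTasks": 4, "failedTasks": 0,
           "id": "jobId-1(linkis-spark-mix-code-1946915)",
           "totalTasks": 6, "runningTasks": 0}]
ratios = job_completion(sample)  # jobId-1 is 4 of 6 tasks done
```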
5. Kill Task
Interface
/api/rest_j/v1/entrance/${execID}/kill
Submission method
POST
Sample Response
{
    "method": "/api/rest_j/v1/entrance/{execID}/kill",
    "status": 0,
    "message": "OK",
    "data": {
        "execID": "${execID}"
    }
}
6. Get task info
Interface
/api/rest_j/v1/jobhistory/{id}/get
Submission method
GET
Request Parameters:
- id: the taskID of the task (path parameter)
- Sample Response
{
    "method": null,
    "status": 0,
    "message": "OK",
    "data": {
        "task": {
            "taskID": 1,
            "instance": "xxx",
            "execId": "exec-id-xxx",
            "umUser": "test",
            "engineInstance": "xxx",
            "progress": "10%",
            "logPath": "hdfs://xxx/xxx/xxx",
            "resultLocation": "hdfs://xxx/xxx/xxx",
            "status": "FAILED",
            "createdTime": "2019-01-01 00:00:00",
            "updatedTime": "2019-01-01 01:00:00",
            "engineType": "spark",
            "errorCode": 100,
            "errDesc": "Task Failed with error code 100",
            "executeApplicationName": "hello world",
            "runType": "xxx",
            "paramJson": "{\"xxx\":\"xxx\"}",
            "costTime": 10000,
            "sourceJson": "{\"xxx\":\"xxx\"}"
        }
    }
}
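A client often reduces this task record to a short status line, e.g. for display or logging. A minimal sketch, assuming only the fields shown in the sample response (costTime is in milliseconds):

```python
def summarize_task(task):
    """Build a one-line summary from a jobhistory task record, using the
    field names from the sample response above."""
    summary = "task {taskID}: {status} in {secs:.1f}s".format(
        taskID=task["taskID"], status=task["status"],
        secs=task["costTime"] / 1000.0)  # costTime is milliseconds
    if task["status"] == "FAILED":
        summary += " (error {errorCode}: {errDesc})".format(**task)
    return summary

line = summarize_task({"taskID": 1, "status": "FAILED", "costTime": 10000,
                       "errorCode": 100,
                       "errDesc": "Task Failed with error code 100"})
```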
7. Get result set list
Interface
/api/rest_j/v1/filesystem/getDirFileTrees
Submission method
GET
Request Parameters:
- path: the result set directory, typically the resultLocation returned by the task info interface (query parameter)
- Sample Response
8. Get result content
Interface
/api/rest_j/v1/filesystem/openFile
Submission method
GET
Request Parameters:
- path: the path of the result file to open (query parameter)
- Sample Response
{
    "method": "/api/filesystem/openFile",
    "status": 0,
    "message": "OK",
    "data": {
        "metadata": [
            {
                "columnName": "count(1)",
                "comment": "NULL",
                "dataType": "long"
            }
        ],
        "totalPage": 0,
        "totalLine": 1,
        "page": 1,
        "type": "2",
        "fileContent": [
            [
                "28"
            ]
        ]
    }
}
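In this response, metadata describes the columns and fileContent holds the rows as arrays of strings. A sketch of pairing the two into row dictionaries on the client side:

```python
def rows_as_dicts(data):
    """Pair each fileContent row with the column names from metadata,
    following the openFile response shape shown above."""
    columns = [col["columnName"] for col in data["metadata"]]
    return [dict(zip(columns, row)) for row in data["fileContent"]]

sample = {
    "metadata": [{"columnName": "count(1)", "comment": "NULL", "dataType": "long"}],
    "fileContent": [["28"]],
}
rows = rows_as_dicts(sample)  # one row keyed by column name
```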
9. Get Result by stream
Get the result as a CSV or Excel file
Interface
/api/rest_j/v1/filesystem/resultsetToExcel
Submission method
GET
Request Parameters:
- path: the path of the result file to export (query parameter)
- Response
binary stream
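Because result paths contain characters such as `:` and `/`, they must be URL-encoded when building the download URL. A sketch of constructing the URL; the query parameter names (path, outputFileType) are assumptions, so check them against your Linkis version before relying on them.

```python
from urllib.parse import urlencode

def resultset_to_excel_url(gateway, path, output_type="csv"):
    """Build the download URL for the streaming interface above.
    gateway is the Linkis gateway address; path is the result file path."""
    # urlencode percent-escapes the result path for use in a query string
    query = urlencode({"path": path, "outputFileType": output_type})
    return "{}/api/rest_j/v1/filesystem/resultsetToExcel?{}".format(gateway, query)

url = resultset_to_excel_url("http://127.0.0.1:9001", "hdfs:///tmp/result/_0.dolphin")
```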
10. Submit Task (old execute interface)
Interface
/api/rest_j/v1/entrance/execute
Submission method
POST
Request Parameters
{
    "executeApplicationName": "hive", // engine type
    "requestApplicationName": "dss", // client service type
    "executionCode": "show tables",
    "params": {
        "variable": { // task variables
            "testvar": "hello"
        },
        "configuration": {
            "runtime": { // task runtime params
                "jdbc.url": "XX"
            },
            "startup": { // ec startup params
                "spark.executor.cores": "4"
            }
        }
    },
    "source": { // task source information
        "scriptPath": "file:///tmp/hadoop/test.sql"
    },
    "labels": {
        "engineType": "spark-2.4.3",
        "userCreator": "hadoop-IDE"
    },
    "runType": "hql" // the type of script to run
}