Linkis-Cli usage documentation

    You can submit a task to Linkis by following the example below.

    The first step is to check that the default configuration file exists in the conf/ directory and contains the required configuration items (for example, the Linkis gateway URL).

    The second step is to enter the Linkis installation directory and run the following command:

```shell
sh ./bin/linkis-cli -engineType spark-2.4.3 -codeType sql -code "select count(*) from testdb.test;" -submitUser hadoop -proxyUser hadoop
```

    In the third step, the console will show that the task has been submitted to Linkis and has started to execute.

    Linkis-cli currently supports only synchronous submission: after submitting a task to Linkis, it keeps polling the task status and pulling task logs until the task ends. If the task finishes with a success status, linkis-cli also pulls the result set and outputs it.
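    The synchronous mode described above can be pictured as a simple poll loop. The sketch below is purely illustrative shell — the real client queries Linkis over HTTP; the counter here merely stands in for that status query, and the status strings mimic typical task states:

```shell
# Illustrative sketch of linkis-cli's synchronous mode: poll the task
# status until it reaches a final state. The counter stands in for the
# real HTTP status query against the Linkis gateway.
polls=0
status="Inited"
while [ "$status" != "Succeed" ] && [ "$status" != "Failed" ]; do
  polls=$((polls + 1))
  # pretend the task finishes successfully after three polls
  if [ "$polls" -ge 3 ]; then
    status="Succeed"
  else
    status="Running"
  fi
  echo "poll $polls: task status: $status"
done
echo "final status: $status"
```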

    The general invocation format is:

```shell
sh ./bin/linkis-cli [parameter] [cli parameter]
```

    Here [parameter] denotes task parameters (single-dash prefix, e.g. -engineType), while [cli parameter] denotes client-side cli parameters (double-dash prefix, e.g. --gwUrl).

    One, add cli parameters

    Cli parameters can be specified manually on the command line; values passed this way override the conflicting configuration items in the default configuration file:

```shell
sh ./bin/linkis-cli -engineType spark-2.4.3 -codeType sql -code "select count(*) from testdb.test;" -submitUser hadoop -proxyUser hadoop --gwUrl http://127.0.0.1:9001 --authStg token --authKey [tokenKey] --authVal [tokenValue]
```

    Two, add engine initial parameters

    Engine startup parameters can be added through the -confMap parameter, whose value is a comma-separated Map:

```shell
-confMap key1=val1,key2=val2,...
```

    For example, the following command sets the yarn queue for engine startup and the number of spark executors:

```shell
sh ./bin/linkis-cli -engineType spark-2.4.3 -codeType sql -confMap wds.linkis.yarnqueue=q02,spark.executor.instances=3 -code "select count(*) from testdb.test;" -submitUser hadoop -proxyUser hadoop
```

    These parameters can also be read from a configuration file; see section Five below.

    Three, add labels

    Labels can be added through the -labelMap parameter. Like -confMap, the -labelMap parameter is also of Map type.
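    By analogy with -confMap, an invocation adding a label might look like the following. This is an illustrative sketch; the label key myLabel and value label123 are example names taken from the configuration example later in this document:

```shell
sh ./bin/linkis-cli -engineType spark-2.4.3 -codeType sql -labelMap myLabel=label123 -code "select count(*) from testdb.test;" -submitUser hadoop -proxyUser hadoop
```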

    Four, variable replacement

    Linkis-cli variable substitution is realized through the ${} placeholder syntax together with the -varMap parameter, which supplies the variable values:

```shell
sh ./bin/linkis-cli -engineType spark-2.4.3 -codeType sql -code "select count(*) from \${key};" -varMap key=testdb.test -submitUser hadoop -proxyUser hadoop
```

    During execution, the sql statement will be replaced with:

```sql
select count(*) from testdb.test
```

    Note that the escape character in '\$' prevents the parameter from being parsed prematurely by the Linux shell. If the code is supplied as a local script via -codePath, the escape character is not required.
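    The substitution itself can be sketched with plain shell tools. The sed one-liner below only illustrates the ${key} replacement performed on the code; it is not how linkis-cli implements it internally:

```shell
# Emulate linkis-cli's ${key} variable substitution with sed.
code='select count(*) from ${key};'   # single quotes keep ${key} literal
key_val='testdb.test'                 # value passed via -varMap key=testdb.test
substituted=$(printf '%s' "$code" | sed "s/\${key}/$key_val/")
echo "$substituted"
```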

    Five, use user configuration

    linkis-cli supports loading a user-defined configuration file. The configuration file path is specified by the --userConf parameter, and the file must be in .properties format:

```shell
sh ./bin/linkis-cli -engineType spark-2.4.3 -codeType sql -code "select count(*) from testdb.test;" -submitUser hadoop -proxyUser hadoop --userConf [configuration file path]
```

    Which parameters can be configured?

    All parameters can be configured, for example:

    cli parameters:

```properties
wds.linkis.client.common.gatewayUrl=http://127.0.0.1:9001
wds.linkis.client.common.authStrategy=static
wds.linkis.client.common.tokenKey=[tokenKey]
```

    label parameters:

```properties
wds.linkis.client.label.engineType=spark-2.4.3
wds.linkis.client.label.codeType=sql
```

    When Map-type parameters are configured, the key format is [Map prefix].[key]. For example, wds.linkis.client.param.conf.spark.executor.instances sets the spark.executor.instances entry of the ConfigurationMap.

    The Map prefixes include:

    • sourceMap prefix: wds.linkis.client.source
    • ConfigurationMap prefix: wds.linkis.client.param.conf
    • runtimeMap prefix: wds.linkis.client.param.runtime
    • labelMap prefix: wds.linkis.client.label

    Note:

    1. variableMap does not support configuration

    2. When a configured key conflicts with a key entered on the command line, the priority is as follows:

       Command-line parameters > keys in command-line Map-type parameters > user configuration > default configuration
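    This priority chain can be sketched with shell default-value expansion. The variable names below are illustrative stand-ins for the four configuration sources, not real linkis-cli variables:

```shell
# Illustrative priority resolution: command-line value > map-type
# parameter key > user configuration > default configuration.
default_conf="spark-2.4.1"   # from the default configuration file
user_conf="spark-2.4.2"      # from the --userConf file
map_param=""                 # no conflicting -confMap/-labelMap key given
cli_value="spark-2.4.3"      # from -engineType on the command line
effective="${cli_value:-${map_param:-${user_conf:-$default_conf}}}"
echo "effective engineType: $effective"
```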

    Example:

    Configure engine startup parameters:

```properties
wds.linkis.client.param.conf.spark.executor.instances=3
wds.linkis.client.param.conf.wds.linkis.yarnqueue=q02
```

    Configure labelMap parameters:

```properties
wds.linkis.client.label.myLabel=label123
```

    Six, output result set to file

    Linkis-cli can write each result set of a task to a file in the output directory, named as follows:

```
task-[taskId]-result-[idx].txt
```

    For example:

```
task-906-result-1.txt
task-906-result-2.txt
```