# Data Stream Tasks

For integration with external information systems (IS) from which data must be obtained using pull technology, the following can be used:

  • Template Tasks, supplied in the monq distribution for common ISs.
  • Custom Tasks, written to create your own integrations and receive data from them.

Reference

Pull technology (also called client pull) is a network communication model in which the initial data request is made by the client, and the response is generated by the server.

A Task is a script written in the YAML language.

The result of the script execution is an "artifact" that is further passed to a monq Data Stream via REST API.

Data Stream Tasks are executed on a connected Agent.

User Guide for working with Tasks

# Template tasks

A detailed description of Template Tasks can be found in this section of documentation.

# Custom Tasks

Custom Tasks for Data Streams are used to implement integrations with external information systems that need to be connected to monq so that events flow into it.

To create integration with an external Information system, you have to:

  1. Create a Data Stream with Configuration Template AnyStream or Without Template (the difference is explained in this section of documentation).

  2. Add a Task on the Configuration tab of the Data Stream.

  3. Write a Task script to be executed on a connected Agent.

    Examples of scripts

  4. Save and start the Data Stream.

# Description of Task script features

# Script fields

| Name | Type | Required | Template | Description |
|------|------|----------|----------|-------------|
| name | string | - | - | Script name. |
| env | arbitrary dictionary | - | + | Global environment variables available anywhere in the script. |
| settings | settings object | - | - | Script execution settings. |
| jobs | array of service jobs | + | - | Service jobs of the script; they run in parallel. |
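Taken together, these fields form the top level of a script. A minimal sketch (the script name, variable, and command below are placeholders for illustration):

```yaml
name: example-script              # optional script name
env:
  BASE_URL: https://example.com   # global variable, visible in every job and step
jobs:                             # the only required field
  - steps:
      - run: echo ${BASE_URL}
```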

# Service job fields

| Name | Type | Required | Template | Description |
|------|------|----------|----------|-------------|
| name | string | - | - | Service job name. |
| env | arbitrary dictionary | - | + | Environment variables available anywhere in the service job. The values overwrite the values of the matching keys in the script dictionary. |
| settings | settings object | - | - | Service job settings. |
| steps | array of steps | + | - | Service job steps that run sequentially. |
| store | arbitrary dictionary | - | + | Variables stored in the global cache whose values are passed to the next launch of the script. The values overwrite the values of the matching keys in the step dictionaries; in turn, they are overwritten by the values in the dictionaries of service jobs that finish later. |
| artifacts | array of artifacts | - | - | Artifacts generated by the service job. |
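A sketch of a service job using these fields (the job name, variables, and command are hypothetical):

```yaml
jobs:
  - name: collect-data
    env:
      TIMEOUT: "30"                 # overrides a matching key from the script-level env
    steps:
      - run: echo collected
        outputs:
          result: $._outputs.shell
    store:
      lastResult: '{{ outputs.result }}'   # persisted to the global cache for the next launch
    artifacts:
      - data: '{{ outputs.result }}'
```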

# Service job step fields

| Name | Type | Required | Template | Description |
|------|------|----------|----------|-------------|
| name | string | - | - | The name of the service job step. |
| env | arbitrary dictionary | - | + | Environment variables available anywhere in the step. The values overwrite the values of the matching keys in the script and service job dictionaries. |
| settings | settings object | - | - | Execution settings for the service job step. |
| run | string | +* | + | Console command to be executed on the agent. |
| plugin | string | +* | + | Agent plugin command to be executed on the agent. |
| with | arbitrary dictionary | - | + | Variables of the executed command. |
| with-secured | arbitrary dictionary | - | + | Protected variables of the executed command; their values are hidden in the logs. |
| outputs | arbitrary dictionary | - | + | Return values. |
| store | arbitrary dictionary | - | + | Variables stored in the global cache whose values are passed to the next launch of the script. |

\* At least one of run or plugin must be specified in a step.
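A sketch of a single step combining these fields (the command and variable names are placeholders):

```yaml
jobs:
  - steps:
      - name: get-date
        run: date
        env:
          TZ: UTC                    # overrides script- and job-level values of the same key
        outputs:
          today: $._outputs.shell    # STDOUT of the executed console command
        store:
          lastRun: $._outputs.shell  # kept in the global cache for the next launch
```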

# Settings fields

| Name | Type | Required | Template | Description |
|------|------|----------|----------|-------------|
| shell | string | - | - | OS shell. Either a path to the executable file or one of the predefined shells can be specified. |

# Predefined shells

| Value | Substituted command |
|-------|---------------------|
| cmd | `cmd.exe /C <command>` |
| bash | `bash -c "<command>"` |
| powershell | `powershell.exe -Command <command>` |
| pwsh | `pwsh -Command <command>` |
| sh | `sh -c '<command>'` |

For various OS families, default values are used:

  • Windows - cmd
  • Unix - bash

In other cases, you must explicitly specify the desired shell.
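For example, a step can select one of the predefined shells through its settings object (the command shown is a placeholder):

```yaml
jobs:
  - steps:
      - settings:
          shell: pwsh      # or a path to an executable file
        run: Get-Date
```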

# Artifact fields

| Name | Type | Required | Template | Description |
|------|------|----------|----------|-------------|
| name | string | - | - | Artifact name. |
| paths | string array | - | + | Paths to files that will be placed in one archive. |
| files | string array | - | + | List of paths to files that will be transferred separately. |
| data | arbitrary object | - | + | Artifact data. |
| send-to | artifact sending settings object | - | - | Settings for sending artifact data. |
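A sketch contrasting paths and files (the artifact names and file paths are hypothetical):

```yaml
artifacts:
  - name: logs-archive
    paths:
      - /var/log/app/app.log        # packed into one archive
      - /var/log/app/error.log
  - name: reports
    files:
      - /opt/reports/summary.json   # transferred as a separate file
```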

# Artifact sending settings fields

| Name | Type | Required | Template | Description |
|------|------|----------|----------|-------------|
| monq | send-to monq settings object | - | - | Settings for sending artifacts to monq. |
| api | API request settings object | - | - | Settings for sending artifacts via an API request. |

# Settings fields for sending artifacts to monq

| Name | Type | Required | Template | Description |
|------|------|----------|----------|-------------|
| keys | string array | - | - | Special system keys. |

# Settings fields for sending artifacts via API request

| Name | Type | Required | Template | Description |
|------|------|----------|----------|-------------|
| method | string | - | + | Request method. The default is POST. |
| uri | string | - | + | Request URI. |
| headers | arbitrary dictionary | - | + | Request headers. |
| query-params | arbitrary dictionary | - | + | Request query parameters. |
| media-type | string | - | + | Request body type. The default is application/json. |
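A sketch of an artifact sent via an API request (the URI and query parameter are placeholders; method and media-type are shown with their default values):

```yaml
artifacts:
  - data: '{{ outputs.result }}'
    send-to:
      api:
        method: POST                     # default
        uri: https://monq.example.com/api/public/cl/v1/stream-data
        headers:
          x-smon-stream-key: $.vars.stream.key
        query-params:
          source: agent                  # hypothetical query parameter
        media-type: application/json     # default
```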

# Minimal script structure

For each YAML script, a minimum structure must be defined in one of the following variations:

```yaml
jobs:
  - steps:
      - run: echo Hello!
```

```yaml
jobs:
  - steps:
      - plugin: pluginName
```

```yaml
jobs:
  - steps:
      - plugin: pluginName
        run: echo Hello!
```

If both the plugin command and the run console command are specified in one step, the plugin is executed first, and then the console command is executed.

# Templates

Some script fields support templates. In templates, you can refer to different groups of variables.

Several template engines are available. For example, the same step can be written with JSONPath-style expressions ($.path.to.value) or with double-curly-brace expressions ({{ path.to.value }}):

```yaml
jobs:
  - steps:
      - run: date /t
        store:
          currentDate: $._outputs.shell
```

```yaml
jobs:
  - steps:
      - run: date /t
        store:
          currentDate: '{{ _outputs.shell }}'
```

# Groups of variables

  • env — environment variables. Their scope depends on where they are defined.

  • vars — agent task variables. Information about the owner of the task can be passed implicitly:

    • stream — the data stream:

    ```json
    {
      "id": 0,
      "name": "",
      "key": "00000000-0000-0000-0000-000000000000",
      "params": {
        "key": "value"
      }
    }
    ```

  • storage — global cache variables.

  • System variables:

    • userspaceId — the userspace identifier.

  • outputs — return variables of a service job; their scope is the corresponding service job.

  • _outputs — return variables of a service job step; their scope is the corresponding service job step. At the end of the step, the variables are merged with the variables of the corresponding service job. Contains predefined variables:

    • shell — output of a successfully executed console command.
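A sketch showing several of these variable groups in one script (the variable names and URI are placeholders):

```yaml
jobs:
  - steps:
      - run: echo ${GREETING}        # env group
        env:
          GREETING: hello
        outputs:
          text: $._outputs.shell     # _outputs group, merged into the job's outputs
    artifacts:
      - data: '{{ outputs.text }}'   # outputs group
        send-to:
          api:
            uri: https://monq.example.com/api/public/cl/v1/stream-data
            headers:
              x-smon-stream-key: $.vars.stream.key   # vars group
              x-smon-userspace-id: $.userspaceId     # system variable
```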

# Script examples

The following are examples of scripts for use in Data Stream Tasks.

# API example

A script example of an agent Task for making a request to the "fake API" jsonplaceholder.typicode.com.

```yaml
---
jobs:
  - steps:
      - run: curl -s https://jsonplaceholder.typicode.com/todos/11
        outputs:
          out_json: $._outputs.shell
    artifacts:
      - data: '{{ outputs.out_json }}'
        send-to:
          api:
            uri: https://monq.example.com/api/public/cl/v1/stream-data
            headers:
              x-smon-stream-key: $.vars.stream.key
              x-smon-userspace-id: $.userspaceId
            media-type: application/json
```

This script has one execution step, run.

The run step launches the curl utility with the appropriate parameters on an external agent.

The STDOUT of the executed command is stored in the out_json variable via the _outputs special variable group.

The artifact data written to the outputs.out_json variable is then passed to the current data stream by the send-to instruction, where:

  • uri - URL of the data stream public API.
  • $.vars.stream.key - API key of the current data stream.
  • $.userspaceId - ID of the current userspace.

After the successful completion of the task, a corresponding entry will appear in the Events & Logs screen.

# Example of collecting information from a system

The given Task example executes the command kubectl get ingresses -n kube-system kubernetes-dashboard -o json on an Agent and passes the execution result to monq in the value field of a JSON object.

```yaml
---
jobs:
  - steps:
      - run: kubectl get ingresses -n kube-system kubernetes-dashboard -o json
        outputs:
          data: $._outputs.shell
    artifacts:
      - data: '{ "value": "{{ outputs.data }}" }'
        send-to:
          api:
            uri: https://monq.example.com/api/public/cl/v1/stream-data
            headers:
              x-smon-stream-key: $.vars.stream.key
              x-smon-userspace-id: $.userspaceId
            media-type: application/json
```

# Example of running pytest functional tests

The following Task example runs the Python framework pytest to perform functional testing and sends the results to Testforge.

In this example, the allure-result.zip file is sent to the Testforge API using Python.

```yaml
---
jobs:
  - steps:
      - run: rm -rf ${WORKSPACE}/allure-results/* && /opt/venv/bin/py.test -v /opt/tests/demo/docs.monq/test_docs.py --alluredir ${WORKSPACE}/allure-results
        env:
          X_FMONQ_PROJECT_KEY: 97911b15-d756-4376-802d-4ae54ab29354
          X_SMON_STREAM_KEY: 5ba11e94-2152-4428-b9fb-56988090cd71
          MONQ_URL: https://monq.example.com
          WORKSPACE: /opt/workspace
        outputs:
          data: $._outputs.shell
    artifacts:
      - data: '{ "value": "{{ outputs.data }}" }'
        send-to:
          api:
            uri: https://monq.example.com/api/public/cl/v1/stream-data
            headers:
              x-smon-stream-key: $.vars.stream.key
              x-smon-userspace-id: $.userspaceId
            media-type: application/json
```

# Example of getting problems from Dynatrace

This Task script requests problems from the Dynatrace API and sends them to monq.

```yaml
---
jobs:
  - steps:
      - run: 'curl -s https://{dynatrace-url}/api/v2/problems --header "Authorization: Api-Token {token}"'
        outputs:
          out_json: "$._outputs.shell"
    artifacts:
      - data: "{{ outputs.out_json }}"
        send-to:
          api:
            uri: https://{monq-url}/api/public/cl/v1/stream-data
            headers:
              x-smon-stream-key: $.vars.stream.key
              x-smon-userspace-id: $.userspaceId
            media-type: application/json
```

In the Events & Logs screen, the result is presented as a structured JSON object.
