IMPORTANT: This documentation has been discontinued. Read the updated Run concepts documentation on our new documentation portal.
Deployment is the process of enabling pipelines whose triggers have already been configured.
Deployment can occur in either the test or the prod environment, as shown in the following figure:
The workflow inside the Digibee Integration Platform consists of three phases. Here is an overview of each one:
The first phase, in which pipelines are built.
Pipelines are composed of components, organized in a logical, sequential, or parallel structure so that an integration can be carried out (for example, transforming data and sending it to an ERP, API, or database).
Components are processing units with well-defined roles, such as making a REST call to an HTTP address. They can use resources external to the pipeline, such as accounts and globals, which store one or more pieces of information while also providing greater security and reuse.
The second phase is the pipeline deployment process, when the pipeline is prepared and made available for consumption or execution.
The pipeline executes according to the previously defined configurations, which determine its control and processing capacity in the chosen environment (test or prod).
In the third and final phase, you can monitor the deployed pipelines to analyze, check, and track their execution status.
Data about the behavior of your pipelines is then available to support management and operation: for example, the number of requests that succeeded or failed, the response time, and the logs.
These are the main Run concepts:
Deployment involves three settings:
Replicas determine the number of instances that will be enabled to serve your integrations with high availability. This guarantees autonomy, concurrent executions, and redundancy.
Consumers define the number of concurrent executions that each deployed replica supports.
The maximum number of consumers is defined by three deployment size ranges.
The deployment size is directly related to the processing capacity and memory of each replica.
The three deployment size ranges are:
SMALL: 1 to 10 consumers
MEDIUM: 1 to 20 consumers
LARGE: 1 to 40 consumers
For example, if you configure 10 consumers (SMALL) for your pipeline executions, each replica can process 10 messages concurrently.
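The size ranges and consumer limits above can be sketched in a short example. This is only an illustration: the size names and their limits come from the table above, while the function names and the overall-capacity formula (replicas × consumers, since each replica handles its own consumers) are assumptions made here for clarity, not part of the Platform's API.

```python
# Illustrative sketch only: SIZE_LIMITS reflects the documented ranges;
# the helper names and the replicas * consumers formula are assumptions.

# Maximum consumers per replica for each deployment size.
SIZE_LIMITS = {"SMALL": 10, "MEDIUM": 20, "LARGE": 40}

def validate_consumers(size: str, consumers: int) -> int:
    """Check that the consumer count fits the chosen deployment size."""
    limit = SIZE_LIMITS[size]
    if not 1 <= consumers <= limit:
        raise ValueError(f"{size} supports 1 to {limit} consumers, got {consumers}")
    return consumers

def total_concurrency(replicas: int, size: str, consumers: int) -> int:
    """Concurrent messages the whole deployment can process:
    each replica handles `consumers` executions at the same time."""
    return replicas * validate_consumers(size, consumers)

# A SMALL deployment with 2 replicas of 10 consumers each
# can process 20 messages concurrently.
print(total_concurrency(replicas=2, size="SMALL", consumers=10))  # -> 20
```

Under these assumptions, increasing either replicas (more instances, more redundancy) or consumers (more parallelism per instance, up to the size limit) raises the total concurrent capacity.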
Pipelines can be deployed to two execution environments: test and prod.
The test environment is intended for evaluating the pipeline's construction and, for that reason, does not have the same service characteristics as the production environment. This gives you a working environment for free experimentation, where applications can be tested and validated.
When a pipeline in the production environment needs an evolutionary change or presents a failure, we suggest making the changes in the test environment and only then returning the pipeline to production.