Run Configurations
Execute Jobs / Transformations on specific nodes or in a Pentaho Cluster.
Pentaho Data Integration provides advanced clustering and partitioning capabilities that allow organizations to scale out their data integration deployments.
In this guided demonstration, you will:
• Configure Master & Slave Nodes
• Execute Run Configurations
So let's start scaling out by adding some servers (nodes).
These can be defined as either:
• Master: the node responsible for distributing work among the worker nodes and ensuring high availability and scalability of the system.
• Slave (Worker): a node that can execute Pentaho work items, such as PDI jobs and transformations, with parallel processing, dynamic scalability, load balancing, and dependency management in a clustered environment.
You can deploy all three Carte instances at the same time with a single command (see the sketch below), or start them individually as described in the following steps.
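As a rough sketch of the all-at-once option (the install path, the ports, and the sample carte-config file names are assumptions based on a default PDI client install, which ships sample configs under data-integration/pwd), the three instances could be launched in the background like this:

```bash
# Sketch only: launch the master and two slaves in the background.
# Install path, ports and config file names are assumptions.
cd ~/pentaho/data-integration
for cfg in carte-config-master-8080.xml carte-config-8081.xml carte-config-8082.xml; do
  nohup ./carte.sh "pwd/$cfg" > "/tmp/${cfg%.xml}.log" 2>&1 &
done
```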
In a terminal, execute the following command (Master).
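For example (a sketch, assuming PDI is installed under ~/pentaho/data-integration and the sample master config on port 8080 is used):

```bash
# Start the Master Carte instance (port 8080 assumed).
cd ~/pentaho/data-integration
./carte.sh pwd/carte-config-master-8080.xml
```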
In a new terminal, execute the following command (Slave A).
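For example (again a sketch; port 8081 and the sample config name are assumptions):

```bash
# Start Slave A (port 8081 assumed).
cd ~/pentaho/data-integration
./carte.sh pwd/carte-config-8081.xml
```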
In a new terminal, execute the following command (Slave B).
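For example (port 8082 and the sample config name are assumptions):

```bash
# Start Slave B (port 8082 assumed).
cd ~/pentaho/data-integration
./carte.sh pwd/carte-config-8082.xml
```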
You should now have 3 terminals, each running a Carte instance.
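To confirm the instances are reachable, you can optionally query each Carte status page; the ports and the default cluster/cluster credentials used below are assumptions based on the sample configurations:

```bash
# Optional check: each running Carte instance serves a status page.
# Ports and the cluster/cluster credentials are assumptions.
for port in 8080 8081 8082; do
  echo "== Carte on port ${port} =="
  curl -s -u cluster:cluster "http://localhost:${port}/kettle/status/" | head -n 5
done
```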
Please don't close the terminals!