Logging
Implement database logging ..
PDI provides helpful log messages that give insight into how a job or transformation is running. Logging can be configured to provide minimal information, just enough to know whether a job or transformation failed or succeeded, or detailed output that includes errors and warnings such as network issues or misconfigurations. The available log levels are:
• Nothing: Don't show any output
• Error: Only show errors
• Minimal: Only use minimal logging
• Basic: The default logging level
• Detailed: Give detailed logging output
• Debug: Very detailed output, for debugging purposes
• Row level: Logging at a row level; this can generate a lot of data
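For reference, the same levels can be passed on the command line when running a transformation with Pan or a job with Kitchen. The file paths below are placeholders only; adjust them to your own files and run the commands from the data-integration directory:
./pan.sh -file=/path/to/tr_hello_world.ktr -level=Detailed
./kitchen.sh -file=/path/to/jb_hello_world.kjb -level=Basic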
When executing a job or transformation from within the Spoon development environment, a "Logging" tab is available, showing any log messages that have been generated. Error messages are shown in red text, making it easy to identify the cause of any failures.
In this guided demonstration, you will:
• Create the logging database tables in the PostgreSQL hibernate database.
• Configure the kettle.properties file for logging.
• Examine the logging tables.
For this demonstration, you will require pgAdmin 4 (or any database management tool) to create the required schema and database tables in PostgreSQL.
Ensure that the Pentaho server is up and running ..!
Create a connection to the hibernate database:

Username: hibuser
Password: password
Test the connection.
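If you prefer the command line, the same connection can be checked with psql. This assumes PostgreSQL is listening locally on the default port 5432 (adjust the host and port for your environment); you will be prompted for the hibuser password:
psql -h localhost -p 5432 -U hibuser -d hibernate -c "SELECT version();"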
Expand the hibernate database -> Schemas -> pentaho_dilogs -> Tables:

The script has already been executed and the logging tables created ..
The scripts for the supported databases are located at:
/home/pentaho/Pentaho/server/pentaho-server/[database folder]
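As a quick sanity check that the logging tables exist, you can list the contents of the pentaho_dilogs schema with a standard information_schema query (again assuming a local server and the hibuser credentials):
# List the tables created in the pentaho_dilogs schema by the logging script
psql -h localhost -U hibuser -d hibernate -c "SELECT table_name FROM information_schema.tables WHERE table_schema = 'pentaho_dilogs' ORDER BY table_name;"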
To enable logging you will need to edit the kettle.properties file to include the logging variables.
Open the kettle.properties file in a text editor.
cd
cd ~/.kettle
sudo nano kettle.properties
For each type of log there are three properties to set (in PDI 10+ these are already included in kettle.properties):
• the name of the log table
• the name of the database connection
• the name of the database schema
KETTLE_CHANNEL_LOG_TABLE=channel_logs
KETTLE_CHANNEL_LOG_DB=live_logging_info
KETTLE_CHANNEL_LOG_SCHEMA=pentaho_dilogs
KETTLE_METRICS_LOG_TABLE=metrics_logs
KETTLE_METRICS_LOG_DB=live_logging_info
KETTLE_METRICS_LOG_SCHEMA=pentaho_dilogs
KETTLE_TRANS_LOG_TABLE=trans_logs
KETTLE_TRANS_LOG_DB=live_logging_info
KETTLE_TRANS_LOG_SCHEMA=pentaho_dilogs
KETTLE_JOBENTRY_LOG_TABLE=jobentry_logs
KETTLE_JOBENTRY_LOG_DB=live_logging_info
KETTLE_JOBENTRY_LOG_SCHEMA=pentaho_dilogs
KETTLE_JOB_LOG_TABLE=job_logs
KETTLE_JOB_LOG_DB=live_logging_info
KETTLE_JOB_LOG_SCHEMA=pentaho_dilogs
KETTLE_STEP_LOG_TABLE=step_logs
KETTLE_STEP_LOG_DB=live_logging_info
KETTLE_STEP_LOG_SCHEMA=pentaho_dilogs
KETTLE_TRANS_PERFORMANCE_LOG_TABLE=transperf_logs
KETTLE_TRANS_PERFORMANCE_LOG_DB=live_logging_info
KETTLE_TRANS_PERFORMANCE_LOG_SCHEMA=pentaho_dilogs
Save the file and restart so the changes take effect.
There is no need to restart if you're using Pentaho 10.0+; the logging variables have already been added to the kettle.properties file.
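To double-check which logging variables are active (for example on 10.0+, where they are pre-populated), a quick grep of the file is enough:
# Show all KETTLE_*_LOG_* variables currently defined in kettle.properties
grep "_LOG_" ~/.kettle/kettle.properties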
Ensure you have the Pentaho server up and running ..!!
This demonstration just gets you up and running with logging your jobs and transformations to a set of logging tables. It's tricky to get any useful information out of these tables without resorting to some complex SQL queries.
Luckily, Pentaho came up with the Operations Mart: a centralized data mart that enables you to query the logging tables and display the information in dashboards as charts and reports.
In PDI, open tr_hello_world.ktr.

Now we need to define a connection to the Pentaho Repository.

Click on the 'Connect' button located in the top right corner.
Click on Add.
Select Pentaho Repository (Recommended)

Enter the following details.

Save and Close
Connect to the 'Pentaho' repository with the following credentials:

Once connected, don't close the open transformation.
Save tr_hello_world.ktr to a folder under Public. You will need to create the folder.

Execute the transformation.
In DBeaver, open and view the data in the trans_logs table (the name configured by KETTLE_TRANS_LOG_TABLE).
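If you would rather stay on the command line, the same data can be inspected with psql (the host and credentials below assume the local hibernate database used earlier):
# Show the transformation log rows written by the tr_hello_world run
psql -h localhost -U hibuser -d hibernate -c "SELECT * FROM pentaho_dilogs.trans_logs;"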
