Logging

Set the transformation logging level to control how much detail PDI writes to the log.


Workshop - Logging

In this workshop we'll take a look at the basic logging options. The logging level can be set anywhere from 'Nothing' (no output at all) up to 'Row Level' (the most verbose); in between sit Error, Minimal, Basic, Detailed and Debug.

  • Change a field's metadata Type to trigger an error.

  • Examine the various levels of logging.


Create a new Transformation

Either of these actions opens a new Transformation tab for you to begin designing your transformation:

  • By clicking File > New > Transformation

  • By using the CTRL-N hot key

Generate Rows

Generate rows outputs a specified number of rows. By default, the rows are empty; however, they can also contain several static fields. This step is used primarily for testing purposes. It may be useful for generating a fixed number of rows, for example, if you require exactly 12 rows for 12 months.

Sometimes you may use Generate Rows to generate one row that is an initiating point for your transformation.

To create an error, change the Type of the message field from String to Integer, then view the Execution Results > Logging tab.

  1. Double-click on the Generate Rows Step and change the Type, as illustrated below.

  2. Click OK.

RUN the Transformation

In this final part of creating a transformation, you'll execute it locally.

  1. Click the RUN button in the Canvas Toolbar.

  2. Change the Log level from Basic to Row Level (a command-line equivalent is sketched below).

  3. Click Run. The misconfigured Step is highlighted.

  4. Click on the Log tab in the Execution Results pane.

It's a bit of an art, but the reason for the error is in there!

You can display just the 'error' log lines by clicking on the minus icon.
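
If you'd rather not use the GUI, the same run can be launched with Pan, PDI's command-line transformation runner, which accepts the same levels (Nothing, Error, Minimal, Basic, Detailed, Debug, Rowlevel) via its log-level option. A minimal sketch; the filename generate_rows.ktr and its location are assumptions, so substitute wherever you saved your transformation:

Linux

# Run the transformation with the most verbose logging (assumed file path)
cd ~/Pentaho/design-tools/data-integration
./pan.sh -file="$HOME/generate_rows.ktr" -level=Rowlevel

Windows

REM Run the transformation with the most verbose logging (assumed file path)
cd C:\Pentaho\design-tools\data-integration
Pan.bat /file:"%USERPROFILE%\generate_rows.ktr" /level:Rowlevel

Row Level output is extremely chatty, so drop back to Basic once you've isolated the problem.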

The ERROR is also logged in:

Windows

C:\Pentaho\design-tools\data-integration\logs\pdi.log

Linux

~/Pentaho/design-tools/data-integration/logs/pdi.log
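
Since errors are appended to pdi.log, you can also follow or search the file from a terminal instead of scrolling the Execution Results pane. A quick sketch using the default paths above:

Linux

# Follow the log live, then pull out just the error lines
tail -f ~/Pentaho/design-tools/data-integration/logs/pdi.log
grep ERROR ~/Pentaho/design-tools/data-integration/logs/pdi.log

Windows (PowerShell)

# Follow the log live, then pull out just the error lines
Get-Content C:\Pentaho\design-tools\data-integration\logs\pdi.log -Wait
Select-String -Pattern ERROR -Path C:\Pentaho\design-tools\data-integration\logs\pdi.log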

➡️ Next: Error Handling
