
Ingesting Text Files

CSV & TXT

Onboarding and reading data from CSV and TXT files can be tricky for a number of reasons:

File Format Issues:

  • Inconsistent delimiters (e.g., mixing commas and tabs)

  • Incorrect line endings (Windows vs. Unix style)

  • Unexpected character encoding (e.g., UTF-8 vs. ASCII)
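
Outside of PDI, the same format issues surface in any ingestion script. Below is a minimal Python sketch, assuming a hypothetical file path, a short list of candidate encodings, and a handful of likely delimiters, that guesses the delimiter and encoding before parsing:

```python
import csv

# Hypothetical input file; the path and candidate encodings are assumptions.
PATH = "incoming/customers.txt"

rows = []
for encoding in ("utf-8", "latin-1"):
    try:
        with open(PATH, "r", encoding=encoding, newline="") as handle:
            sample = handle.read(4096)
            handle.seek(0)
            # Sniffer guesses the delimiter from a sample of the file.
            dialect = csv.Sniffer().sniff(sample, delimiters=",;\t|")
            rows = list(csv.reader(handle, dialect))
        break
    except UnicodeDecodeError:
        continue  # wrong encoding guess; try the next candidate
else:
    raise ValueError("none of the candidate encodings could decode the file")

# newline="" lets the csv module normalise Windows (\r\n) and Unix (\n) endings.
print(f"{len(rows)} rows, delimiter {dialect.delimiter!r}, encoding {encoding!r}")
```

The Text File Input step covers the same ground declaratively with its Separator, Format (DOS/Unix), and Encoding settings.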

Data Quality Problems:

  • Missing values or incomplete records

  • Inconsistent data types within columns

  • Duplicate records
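
A short pandas sketch of how these quality checks might look in code; the file and column names are assumptions:

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
df = pd.read_csv("incoming/orders.csv")

# Missing values / incomplete records: inspect, then drop rows missing key fields.
print(df.isna().sum())
df = df.dropna(subset=["order_id", "amount"])

# Inconsistent data types within a column: coerce, then inspect what failed.
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
print(df[df["amount"].isna()])

# Duplicate records: keep the first occurrence.
df = df.drop_duplicates()
```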

Header Row Handling:

  • Presence or absence of header row

  • Misaligned headers with data columns
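
When the header is missing or unreliable, supplying field names yourself is usually safer than trusting the first line. A hedged pandas sketch, with hypothetical paths and column names:

```python
import pandas as pd

# No header row: provide the field names explicitly.
df = pd.read_csv("incoming/no_header.csv",
                 header=None, names=["id", "name", "created_at"])

# Misaligned or untrustworthy header: skip it and impose known names instead.
df = pd.read_csv("incoming/misaligned.csv",
                 skiprows=1, names=["id", "name", "created_at"],
                 index_col=False)
```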

Dynamic File Names:

  • Difficulty in handling files with changing names or timestamps
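
A small sketch of picking up files whose names change from run to run, using a wildcard pattern (the directory and pattern are assumptions). The Text File Input step addresses the same need with a regular-expression wildcard in its file selection:

```python
import glob
import os

# Files arrive as e.g. sales_20240131.csv; directory and pattern are assumptions.
candidates = glob.glob("incoming/sales_*.csv")
if not candidates:
    raise FileNotFoundError("no files matched the pattern")

# Process the most recently modified file, or loop over all matches.
latest = max(candidates, key=os.path.getmtime)
print(f"Ingesting {latest}")
```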

Complex Data Structures:

  • Nested data or hierarchical information in flat files

  • Multi-line records
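
A sketch of two common cases: multi-line records kept intact by proper quoting, and a nested JSON payload stored in a single column of a flat file (paths and sample values are assumptions):

```python
import csv
import json

# Multi-line records: embedded newlines survive as long as the field is quoted
# and the file is opened with newline="".
with open("incoming/notes.csv", newline="", encoding="utf-8") as handle:
    for record in csv.reader(handle):
        print(record)  # a quoted "line one\nline two" arrives as one field

# Nested data in a flat file, e.g. a JSON blob stored in one column.
row = ["42", '{"city": "Berlin", "zip": "10115"}']
payload = json.loads(row[1])
print(row[0], payload["city"], payload["zip"])
```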

Data Type Conversion:

  • Improper automatic type inference

  • Date and time format inconsistencies
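
A pandas sketch of taking type inference into your own hands; the file, column names, and date format are assumptions:

```python
import pandas as pd

# Switch off fragile automatic inference for sensitive columns.
df = pd.read_csv(
    "incoming/invoices.csv",
    dtype={"invoice_no": "string", "customer_id": "Int64"},
)

# Mixed date formats: parse explicitly and flag whatever does not match.
df["invoice_date"] = pd.to_datetime(
    df["invoice_date"], format="%d/%m/%Y", errors="coerce"
)
print(df[df["invoice_date"].isna()])  # rows whose dates need attention
```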

Workshops

The Text File Input step in Pentaho Data Integration (PDI) allows you to extract data from various text file formats like CSV, fixed-width, and delimited files. It offers capabilities for handling headers, footers, compression, and complex file patterns, making it ideal for importing raw data from external systems, log files, or legacy exports.

The Text File Output step enables you to export transformation results to text files with configurable formats, delimiters, and compression options. This step is frequently used for creating reports, generating data exchange files for other systems, archiving processed data, and creating backup files. Together, these components form the foundation of many ETL workflows by facilitating seamless data import and export operations.
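
Both steps are configured in the Spoon client rather than written as code, but the round trip they perform is easy to sketch. Here is a hedged Python equivalent of a Text File Input and Text File Output pair, with assumed paths, delimiters, and compression settings:

```python
import pandas as pd

# "Text File Input" side: read a gzipped, semicolon-delimited legacy export,
# skipping a one-line footer (paths and layout are assumptions).
df = pd.read_csv(
    "exports/legacy_dump.csv.gz",
    sep=";",
    compression="gzip",
    skipfooter=1,
    engine="python",  # skipfooter requires the python parsing engine
)

# "Text File Output" side: write the result as a pipe-delimited, gzipped file.
df.to_csv("outgoing/report.csv.gz", sep="|", index=False, compression="gzip")
```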

Text File Input

This simple workflow introduces some of the key steps used for ingesting text files. The use case refers to text files where you need to:

  • change the layout / structure

  • extract key values - create new data stream fields

  • string cut or replace values

  • change the data stream field type
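
In PDI these tasks are typically handled with steps such as Select values, Strings cut, and Replace in string. A hedged pandas sketch of the same four operations on a hypothetical tab-delimited file (field names are assumptions):

```python
import pandas as pd

# Hypothetical tab-delimited input; field names are assumptions.
df = pd.read_csv("incoming/readings.txt", sep="\t")

# Change the layout / structure: keep and reorder only the fields needed.
df = df[["sensor", "raw_value", "recorded_at"]]

# Extract key values into new data stream fields (string cut / replace).
df["site_code"] = df["sensor"].str[:3]                             # cut
df["channel"] = df["sensor"].str.replace(r"^.*-", "", regex=True)  # replace

# Change the data stream field type.
df["raw_value"] = df["raw_value"].astype(float)
df["recorded_at"] = pd.to_datetime(df["recorded_at"])
```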
