For this project, Pentaho Data Integration (Kettle) was used to design all ETL processes that extract data from various sources, including external files, cleanse it, and load it into the target data warehouse. The transformations created involve configuring the following steps: Table input, Table output, Text file output, CSV file input, Insert/Update, Add constants, Filter rows, Value Mapper, Stream Lookup, Sort rows, Database Lookup, and Set Environment variables. A rough sketch of the equivalent flow is shown below.
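To make the extract → cleanse → load flow concrete, here is a minimal Python sketch of the same idea. It is only an illustration, not the actual Kettle transformation (those steps are configured graphically in Spoon), and the file name, table name, and column names used here are hypothetical stand-ins.

```python
# Illustrative Python equivalent of the Kettle flow:
# CSV file input -> Filter rows / Value Mapper / Sort rows -> Insert/Update.
# File, table, and column names below are hypothetical examples.
import csv
import sqlite3

SOURCE_CSV = "sales.csv"      # stand-in for the CSV file input step
TARGET_DB = "warehouse.db"    # stand-in for the target data warehouse

def extract(path):
    """CSV file input: read raw rows from the external file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def cleanse(rows):
    """Filter rows + Value Mapper + Sort rows: drop bad rows, map codes, sort."""
    region_map = {"N": "NORTH", "S": "SOUTH"}        # Value Mapper equivalent
    cleaned = []
    for row in rows:
        if not row.get("id"):                        # Filter rows equivalent
            continue
        row["region"] = region_map.get(row.get("region", ""), "UNKNOWN")
        cleaned.append(row)
    return sorted(cleaned, key=lambda r: r["id"])    # Sort rows equivalent

def load(rows, db_path):
    """Insert/Update: upsert each cleansed row into the target table."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS fact_sales "
        "(id TEXT PRIMARY KEY, region TEXT, amount REAL)"
    )
    con.executemany(
        "INSERT INTO fact_sales (id, region, amount) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET region = excluded.region, "
        "amount = excluded.amount",
        [(r["id"], r["region"], float(r.get("amount") or 0)) for r in rows],
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(cleanse(extract(SOURCE_CSV)), TARGET_DB)
```

In the actual project, each of these stages is a configured step in a Pentaho transformation rather than hand-written code; the sketch simply shows the shape of the data flow.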