Updated 07.03.2024


20% available
Data Engineer / ETL Developer
Munich, Germany
Germany +2
General IT Apprenticeship

Skills
DevOps, ETL Developer, Operations, SQL/PL/SQL, DWH, DWH Developer, Testing/QA/Rollout/Support, Data Engineering, Snowflake, Data Vault 2.0, Oracle 12c, Oracle Data Integrator 12
Data mapping/transformation with Staging/Core/Mart layers
Data engineering with Snowflake/Oracle/DB2/MSQL/PostgreSQL
Data Vault 2.0 model
Talend and SAP BODS ETL tools
Development with the Talend ETL tool
Python with Pandas
SQL/PL/SQL, Oracle, PostgreSQL, DB2, Informix DB
Oracle 12c
Oracle Data Integrator 12
Agile Scrum, Jira, Remedy, ClearQuest, Jenkins, Python, GitHub, CI/CD
Problem management, application management, operations 2nd/3rd level
FTP tools, batch processing, cron, job scheduling, file transfer
Testing: HP ALM, QTP, test automation, Selenium
Linux, shell scripting, Bash/Perl
Billing systems: Kenan Arbor, Rator, Amdocs
Mediation: CDR/SDR processing and rating
XML, CSV, and reporting
Languages
German: business fluent; English: native speaker
Project History
- Developed a new Smart HR solution by extracting data in blob form from an Azure container using Python.
- Transformed the extracted data into data frames for further processing.
- Imported the transformed data into the Oracle Staging layer.
- Used the SAP BODS ETL tool for additional data transformation and integration.
- Developed and implemented Talend jobs using Snowflake components for efficient ELT data processing.
- Transferred large volumes of data from the Staging area to the Data Lake and Core layers of the Snowflake data warehouse.
- Performed data engineering tasks, including deduplication and historization, to ensure data integrity and reliability using Data Vault 2.0.
- Leveraged Talend connectors and applied complex business logic to extract data from upstream interfaces.
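The deduplication and historization steps above can be sketched in pandas as an insert-only Data Vault 2.0 satellite load driven by a hash diff. This is an illustrative sketch, not the actual Talend job; the table layout, key, and column names are hypothetical:

```python
import hashlib
import pandas as pd

def hash_diff(row, attrs):
    # MD5 over the descriptive attributes, as used for satellite change detection
    return hashlib.md5("|".join(str(row[a]) for a in attrs).encode()).hexdigest()

def load_satellite(sat, stage, key, attrs, load_ts):
    """Insert-only historization: append a satellite row only when the hash diff
    for a business key changed, or the key is new."""
    # deduplicate the staging delivery first
    stage = stage.drop_duplicates(subset=[key] + attrs).copy()
    stage["hash_diff"] = stage.apply(lambda r: hash_diff(r, attrs), axis=1)
    # latest satellite row per business key
    latest = sat.sort_values("load_date").groupby(key).tail(1)[[key, "hash_diff"]]
    merged = stage.merge(latest, on=key, how="left", suffixes=("", "_old"))
    # NaN != hash also catches brand-new keys
    changed = merged[merged["hash_diff"] != merged["hash_diff_old"]]
    new_rows = changed[[key] + attrs + ["hash_diff"]].assign(load_date=load_ts)
    return pd.concat([sat, new_rows], ignore_index=True)
```

An unchanged record produces the same hash diff as the latest satellite row and is skipped, so reruns of the same delivery are idempotent.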
- DWH development of data tables for migration
- Talend ETL job development for data engineering
- REST API integration with Talend ESB to extract data from STARC
- Tableau development for data visualization and reporting
- Operation of the Azure cloud repository for the Talend project
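The REST-extract-and-stage pattern from the project above can be sketched in Python. In the real pipeline the extraction was done with Talend ESB components against the STARC interface; the payload shape and field names here are hypothetical:

```python
import csv
import io
import json

# Hypothetical payload as a REST endpoint might return it
SAMPLE_RESPONSE = json.dumps({
    "records": [
        {"id": 1, "name": "Alpha", "updated": "2024-03-01"},
        {"id": 2, "name": "Beta", "updated": "2024-03-02"},
    ]
})

def response_to_staging_csv(body):
    """Flatten the JSON payload into a CSV string ready for a staging-layer load."""
    records = json.loads(body)["records"]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "name", "updated"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()
```

Writing to an intermediate flat file keeps the extract step decoupled from the bulk load into the staging table, mirroring the file-based handoff common in ETL jobs.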