Updated: 04.11.2025


Data Engineer
Cologne, Germany
Germany +2
BA International Business
Skills
Data Engineering, Data Warehouse, Snowflake, Python, SQL, Google BigQuery, PostgreSQL, MS SQL, Data Platform, Data Pipelines, MS SQL Server, Data Analytics, Data Warehousing, Data Modelling, Google Cloud Platform, Data Visualization, ETL DWH, ETL Clover
- PostgreSQL / SQL Server / MySQL
- Snowflake
- Oracle
- Google Cloud Platform
- Google BigQuery
- Cloud Composer
- Looker / Google Data Studio
- Microsoft Azure
- Azure DevOps
- Azure Data Factory
- Python
- dlt
- dbt
- Airflow
- Prefect
- Business Intelligence Tools
- Metabase
- Microsoft Excel
- Google Sheets
- Tableau
- PowerBI
- Supermetrics
- Jira / Confluence
- Git
- Pipedrive
Languages
- German (native speaker)
- English (business fluent)
- Russian (good)
- Spanish (good)
Project History
• Designing and optimizing modern data architectures
• Developing and maintaining ETL/ELT pipelines (Azure Data Factory, SSIS, PowerBI, MS SQL)
• Implementing reporting solutions (PowerBI & SSRS) and data models based on the Kimball methodology
- Orchestrated dbt models and Python code so that CRM specialists could run high-impact marketing campaigns efficiently.
- Used Snowflake, Prefect, and dbt for execution (see the sketch below).
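The following is a minimal sketch of the orchestration pattern described in the two points above, not taken from the actual project: a Prefect 2 flow that triggers dbt CLI runs against a Snowflake target. The flow name, retry settings, and target profile are illustrative placeholders.

```python
# Illustrative sketch only: a minimal Prefect 2 flow that runs dbt against a
# Snowflake target via the dbt CLI. Names and settings are placeholders.
import subprocess
from prefect import flow, task

@task(retries=2, retry_delay_seconds=60)
def run_dbt(command: str, target: str = "snowflake_prod") -> None:
    # Invoke the dbt CLI; the target profile name is an assumption.
    subprocess.run(["dbt", command, "--target", target], check=True)

@flow(name="crm-campaign-refresh")
def refresh_crm_marts() -> None:
    # Rebuild the models feeding CRM campaign segmentation, then test them.
    run_dbt("run")
    run_dbt("test")

if __name__ == "__main__":
    refresh_crm_marts()
```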
- Driving the centralization of accounting and finance data from several countries, drawing on various source systems (MS SQL, MS Dynamics, web APIs); a simplified extraction sketch follows after this list
- Developing resilient data pipelines for integration with IBM Cognos, ensuring reliable data flow and accessibility
- Creating a comprehensive, flexible data model designed to adapt to future business needs
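Below is a simplified, illustrative sketch of the kind of Python job used to consolidate finance records from an MS SQL source and a web API into a single staging table. The connection strings, endpoint, column names, and table names are placeholders, not taken from the actual project.

```python
# Illustrative sketch only: consolidating finance records from an MS SQL
# source and a web API into one staging table. All names are placeholders.
import pandas as pd
import requests
import sqlalchemy

SOURCE_DSN = "mssql+pyodbc://user:pass@finance-db/erp?driver=ODBC+Driver+17+for+SQL+Server"
TARGET_DSN = "postgresql://user:pass@warehouse/dwh"
API_URL = "https://example.invalid/api/invoices"  # placeholder endpoint

def extract_mssql() -> pd.DataFrame:
    engine = sqlalchemy.create_engine(SOURCE_DSN)
    # Placeholder query; real column set depends on the source system.
    return pd.read_sql("SELECT invoice_id, country, amount, booked_at FROM invoices", engine)

def extract_api() -> pd.DataFrame:
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    # Assumes the endpoint returns a JSON list of flat records.
    return pd.DataFrame(response.json())

def load(frames: list[pd.DataFrame]) -> None:
    engine = sqlalchemy.create_engine(TARGET_DSN)
    combined = pd.concat(frames, ignore_index=True)
    combined["loaded_at"] = pd.Timestamp.now(tz="UTC")
    combined.to_sql("stg_invoices", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load([extract_mssql(), extract_api()])
```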