Updated 20.10.2025


100 % available
Cloud Architect and Data Architect
Basel, Switzerland
Worldwide
M. Sc. Data Science
Skills
Cloud Architect, Senior Cloud Engineer, Senior Data Engineer, Team Lead, ML DevOps, graph databases, Kubernetes, Java, Python, .NET, DevOps, test-driven development, Confluence, Jira, Azure, AWS, Lambda, Azure Pipelines, AWS EMR, AWS Glue, Azure Data Factory, Azure Machine Learning, AKS / EKS, ADLS Gen2 / AWS S3, data ingestion, data visualization, Azure Functions, big data, DWH, batch processing, Hadoop, backend, Excel, Node.js, R, Shiny, GCP
Data(base): SQL, Neo4j, Cosmos DB, Databricks / Spark, Kafka, DVC, SSDT, Delta Lake, Cassandra, .NET 6, Scala
Orchestration, versioning and tracking: Terraform, Kubernetes, Docker, Ansible, MLflow, Argo, Luigi, Airflow, Git, GitHub, GitHub Actions
Visualization: Power BI, Grafana
Machine learning: algorithms, regressions, SVM, ensemble methods, XGBoost, RM, clustering, KNN, naïve Bayes, graphical models, deep learning, neural networks, NLP, autoencoders, Bayesian methods, MCMC, correspondence analysis
Languages
Chinese: native speaker
German: business fluent
English: business fluent
French: business fluent
Italian: basic knowledge
Project history
To migrate a classical machine learning model that predicts the vehicle repair rate to Azure, I ported the Python project to Spark and used Databricks, MLflow and Azure Data Factory to automate the data ingestion and parallelize the model training. Front-end data visualization uses Grafana. Tools: Azure Data Factory, Databricks, Spark, Delta Lake, Azure Pipelines, Grafana.
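A minimal PySpark/MLflow sketch of the parallelized-training idea, assuming a Databricks-style environment; the Delta table "vehicle_repairs" and its columns are hypothetical placeholders, not the production code:

```python
# Minimal sketch: parallel training of model variants on a Spark cluster,
# with MLflow tracking. Table and column names are hypothetical.
import mlflow
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.evaluation import RegressionEvaluator
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import GBTRegressor
from pyspark.ml.tuning import CrossValidator, ParamGridBuilder

spark = SparkSession.builder.getOrCreate()
mlflow.pyspark.ml.autolog()  # log params, metrics and models of every run

# Data lands in Delta Lake via the Azure Data Factory ingestion pipeline.
df = spark.read.format("delta").table("vehicle_repairs")

assembler = VectorAssembler(inputCols=["mileage", "vehicle_age"],
                            outputCol="features")
gbt = GBTRegressor(labelCol="repair_rate")
grid = (ParamGridBuilder()
        .addGrid(gbt.maxDepth, [3, 5])
        .addGrid(gbt.maxIter, [50, 100])
        .build())

# CrossValidator fits the model variants in parallel across the cluster.
cv = CrossValidator(estimator=Pipeline(stages=[assembler, gbt]),
                    estimatorParamMaps=grid,
                    evaluator=RegressionEvaluator(labelCol="repair_rate"),
                    parallelism=4)
best_model = cv.fit(df).bestModel
```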
Kitchen furniture blueprints sometimes contain errors that must be caught before the plans are sent to construction. As a technical team working with a kitchen provider, we received thousands of kitchen plans daily. I designed a graph model to represent the furniture of a kitchen, geometric and graph algorithms to detect errors in it, and a pipelined workflow on Azure to process the data. Kitchen blueprints are fed into the workflow via Kafka in real time, then validated and sent to a Neo4j cluster by an Azure Function. Once a blueprint has been ingested and analyzed in Neo4j, the result is written to Cosmos DB and, whenever an error or warning is detected, returned to the client. Tools: knowledge graph, Kafka, Neo4j, Azure Functions, AKS, Python and .NET.
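To illustrate the validate-and-ingest step, here is a small Python sketch; the topic name, the blueprint JSON schema and the Cypher data model are all hypothetical, and the production version ran inside an Azure Function rather than as a standalone consumer:

```python
# Minimal sketch: consume blueprints from Kafka, validate, write to Neo4j.
# All names (topic, schema, Cypher model) are hypothetical placeholders.
import json
from kafka import KafkaConsumer      # pip install kafka-python
from neo4j import GraphDatabase      # pip install neo4j

consumer = KafkaConsumer("kitchen-plans",
                         bootstrap_servers="broker:9092",
                         value_deserializer=lambda b: json.loads(b))
driver = GraphDatabase.driver("neo4j://neo4j-cluster:7687",
                              auth=("neo4j", "secret"))

def ingest(tx, plan):
    # One node per furniture item, ADJACENT edges between neighbours,
    # so the error-detection graph algorithms can run inside Neo4j.
    for item in plan["items"]:
        tx.run("MERGE (f:Furniture {id: $id}) SET f.kind = $kind, f.width = $w",
               id=item["id"], kind=item["kind"], w=item["width"])
    for a, b in plan["adjacent"]:
        tx.run("MATCH (x:Furniture {id: $a}), (y:Furniture {id: $b}) "
               "MERGE (x)-[:ADJACENT]->(y)", a=a, b=b)

for msg in consumer:
    plan = msg.value
    if not plan.get("items"):        # reject malformed blueprints up front
        continue
    with driver.session() as session:
        session.execute_write(ingest, plan)
```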
Astronomical data can be immense and requires a big data solution to process it. Working together with astronomers, I designed a Spark application for an on-premises DWH that processes the data, reading from and writing to its different layers. The application combines batch processing for existing data with streaming for newly arriving data. I also optimized the Spark application at several levels, which improved its performance markedly. Tools: Spark 2.x, Kafka, Hadoop, Java.
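A compact PySpark sketch of the batch-plus-streaming pattern (the original application was written in Java); the paths, the Kafka topic and the DWH layer names are hypothetical:

```python
# Minimal sketch: batch reprocessing of existing data plus a streaming
# append path for new data. Paths and topic names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("astro-dwh").getOrCreate()

# Batch: reprocess existing observations from the raw into the curated layer.
raw = spark.read.parquet("hdfs:///dwh/raw/observations")
curated = (raw.filter(F.col("flux").isNotNull())
              .withColumn("obs_date", F.to_date("timestamp")))
(curated.write.mode("overwrite")
        .partitionBy("obs_date")
        .parquet("hdfs:///dwh/curated/observations"))

# Streaming: continuously append new observations arriving on Kafka.
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "observations")
          .load()
          .selectExpr("CAST(value AS STRING) AS payload"))
(stream.writeStream.format("parquet")
       .option("path", "hdfs:///dwh/landing/observations")
       .option("checkpointLocation", "hdfs:///dwh/_checkpoints/observations")
       .start()
       .awaitTermination())
```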