Updated 07.07.2025


Premium customer
Availability: not available
Senior Software Developer / Machine Learning Engineer / Python
Darmstadt, Germany
Worldwide
M.Sc. Computer Science - TU Darmstadt
Skills
Welcome to my profile! My journey has taken me through diverse projects encompassing core software development, the intricacies of machine learning model creation, lifecycle management, and comprehensive data analysis. My dedication to every phase, from conceptualization to execution, has positioned me at the vanguard of technology, ensuring timely and innovative solutions for clients. I welcome new opportunities and challenges, so please don't hesitate to reach out.
Core Competencies:
- Algorithm Implementation: Efficiently implements computationally intensive algorithms for scalability.
- Market Analysis: Specializes in time series and stock market analytics.
- Deep Learning Expertise: Proficient in deep learning and ensuring numerical stability.
- Advanced Data Analytics: Experienced in Bayesian optimization and analyzing high-dimensional, non-linear data.
- Data Visualization: Capable of visualizing, presenting, and documenting complex data and tasks.
- Process Automation: Strong in automating processes and verifying complex algorithms.
- Research Aptitude: Demonstrates effectiveness in research environments.
- Reinforcement Learning: Knowledgeable in reinforcement learning with education from IAS Darmstadt, TU Darmstadt.
- Advanced Modelling: Experienced in multi-armed bandits, model predictive control, and probabilistic modelling.
- Research Trends: Stays current with the latest research developments.
- Feature Extraction & Analysis: Proficient in feature extraction, clustering, and classification algorithms (see the clustering sketch after this list).
- Version Control: Competent in using Git and other versioning systems.
- Tool Proficiency: Skilled in various analytical tools and frameworks.
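To make the feature-extraction and clustering item above more tangible, here is a minimal, purely illustrative sketch using scikit-learn; the synthetic feature matrix, the number of clusters, and the random seed are assumptions chosen for demonstration only.

```python
# Minimal clustering sketch (illustrative only): synthetic features,
# standard scaling, k-means, and a silhouette score as a sanity check.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))  # placeholder feature matrix (assumption)

X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_scaled)

print("cluster sizes:", np.bincount(labels))
print("silhouette score:", round(silhouette_score(X_scaled, labels), 3))
```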
Professional Experience:
- Quantitative Analyst/Machine Learning Scientist, Privately Managed Fund: Utilized machine learning techniques for fund management.
- Developer, Analytics Dashboard: Developed an analytics dashboard using Django, Flask, and MongoDB.
- Database Developer: Designed and developed a relational database (PostgreSQL) with API access via FastAPI, applying security standards according to Azure best practices.
- Machine Learning Engineer: Built a robust stock analytics algorithm for real-time anomaly detection and reporting (PyTorch, Pandas); a simplified sketch of such a check follows after this list.
- Research Associate: Solved a non-linear optimization task using deep learning architectures for an LED lamp setting optimization project (TensorFlow, MATLAB).
- Robotics Data Scientist: Worked extensively with robotics data and implemented classic control and reinforcement learning algorithms.
- Data Scientist, Twitter Sentiment Analysis: Conducted real-time sentiment analysis on Twitter data using spaCy.
- Data Engineer, Multiple Projects: Developed several data pipelines with distributed processing and data provisioning via REST API access (Google Cloud, FastAPI, Docker, Terraform).
- Time Series Data Specialist: Led numerous projects involving time series data.
- Software Development: Developed several software projects from initial design to operations, including CI/CD pipelines, Kubernetes deployments, MLOps production principles, and live runtime monitoring.
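As a simplified illustration of the anomaly-detection work listed above, the following sketch flags outliers in a price series via a rolling z-score in Pandas; the synthetic data, the 30-step window, and the threshold of 3 are assumptions for demonstration, not the production algorithm.

```python
# Minimal rolling z-score anomaly check (illustrative assumptions:
# synthetic prices, 30-step window, threshold of 3 standard deviations).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
prices = pd.Series(100 + rng.normal(0, 1, 1000).cumsum())
prices.iloc[700] += 15  # inject an artificial jump to detect

returns = prices.pct_change()
window = 30
z = (returns - returns.rolling(window).mean()) / returns.rolling(window).std()

anomalies = returns[z.abs() > 3]
print(f"flagged {len(anomalies)} returns; index 700 flagged: {700 in anomalies.index}")
```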
- Cloud: AWS, Azure (DevOps, AI/ML, Buckets), Google Cloud
- DevOps & Infra: ArgoCD, Terraform, Docker, Kubernetes
- CI/CD & VCS: Git, GitHub Actions, GitLab CI/CD
- APIs & Web: FastAPI, Flask
- Data & DBs: SQL, PostgreSQL, Pandas, Numpy, Jupyter
- ML/AI: Scikit-learn, TensorFlow / TF Probability, Keras, PyTorch, MLFlow, JAX, Pyro, Stan
- Streaming: Kafka
- Language: Python
Languages
German: native speaker
English: business fluent
Spanish: basic knowledge
Project history
Conceptual Design & Implementation:
Led the development of a robust inference service capable of deploying and executing machine learning models in production across various types. Designed and implemented model orchestration for real-time operations, enabling seamless execution and switching of models in live environments. Focused on scalability, reliability, and integration into existing systems.
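As a rough, hedged sketch of what such an inference service with switchable models can look like, the snippet below exposes a FastAPI prediction endpoint plus a hot-switch endpoint; the in-memory registry, the model names, and the payload schema are placeholders rather than the actual production design.

```python
# Minimal FastAPI inference sketch with switchable models (illustrative).
# The in-memory registry and dummy models are assumptions; in a real
# service, models would come from a registry such as MLFlow/AzureML.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

# Placeholder "models": callables mapping a feature list to a prediction.
MODEL_REGISTRY = {
    "baseline": lambda features: sum(features) / len(features),
    "v2": lambda features: max(features),
}
active_model = {"name": "baseline"}

class PredictRequest(BaseModel):
    features: list[float]

@app.post("/predict")
def predict(req: PredictRequest):
    model = MODEL_REGISTRY[active_model["name"]]
    return {"model": active_model["name"], "prediction": model(req.features)}

@app.post("/models/{name}/activate")
def activate(name: str):
    if name not in MODEL_REGISTRY:
        raise HTTPException(status_code=404, detail="unknown model")
    active_model["name"] = name  # hot-switch the serving model
    return {"active": name}
```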
DevOps Practices:
Employed DevOps methodologies using Kubernetes, FastAPI, Kafka, and AzureML to ensure streamlined development, deployment, and monitoring of ML solutions. Integrated live monitoring and alarming via Grafana and OpenTelemetry for continuous operational oversight.
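To illustrate how streaming input can feed such a setup, here is a minimal consumer sketch using the kafka-python client; the topic name, broker address, consumer group, and JSON message layout are assumptions, not the project's actual configuration.

```python
# Minimal Kafka consumer sketch (kafka-python). Topic, broker address and
# JSON message layout are illustrative assumptions.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "measurements",                      # assumed topic name
    bootstrap_servers="localhost:9092",  # assumed broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="latest",
    group_id="inference-service",        # assumed consumer group
)

for message in consumer:
    payload = message.value  # e.g. {"sensor_id": "...", "value": 3.2, "ts": "..."}
    # hand the payload to the model-serving layer here
    print(message.topic, message.offset, payload)
```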
Machine Learning:
Developed forecasting models and comprehensive model training and validation pipelines (MLFlow, AzureML). Established a model lifecycle management system with robust versioning to support continuous improvement and controlled deployment of newly trained models. The models were mainly trained on produced solar and wind energy data and provide generation forecasts.
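A minimal sketch of the kind of MLFlow-based lifecycle described here: log a trained model with a metric, register it, and later load a specific registered version. The experiment and model names, the toy model, and the local SQLite backend are assumptions for illustration.

```python
# Minimal MLFlow lifecycle sketch: train, log, register, then load back
# by registry URI. Names and the toy model are illustrative assumptions.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge

mlflow.set_tracking_uri("sqlite:///mlflow.db")     # assumed local backend so the registry works
mlflow.set_experiment("energy-forecasting-demo")   # assumed experiment name

X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)
model = Ridge().fit(X, y)

with mlflow.start_run():
    mlflow.log_metric("train_r2", model.score(X, y))
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="wind-forecast-demo",  # creates/increments a registry version
    )

# Later, e.g. inside the inference service, load a specific registered version:
loaded = mlflow.pyfunc.load_model("models:/wind-forecast-demo/1")
print(loaded.predict(X[:3]))
```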
Project Rollout:
Drove the productization of ML models, including live monitoring and alerting mechanisms, supporting their integration into production environments with minimal disruption. The solution runs in production today, supporting operations in a control center for high-voltage grids.
Team Collaboration:
Took an active role as Tech Lead, guiding a cross-functional team through collaborative development and agile ceremonies within a Scrum framework. Facilitated architectural decision-making through technical discussions across domains.
Innovative Problem-Solving:
Developed a domain-specific data quality service that validates incoming data for consistency and relevance, ensuring that downstream ML processes operate on high-quality inputs.
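The following is a minimal Pandas sketch of the kind of checks such a data quality service can run; the column names, the expected 15-minute cadence, and the plausible value range are assumptions, not the actual domain rules.

```python
# Minimal data-quality check sketch (illustrative). Column names, the expected
# 15-minute cadence and the plausible value range are assumptions.
import pandas as pd

def validate_measurements(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable findings; an empty list means 'looks OK'."""
    findings = []
    if df["value"].isna().any():
        findings.append(f"{int(df['value'].isna().sum())} missing values")
    if not df["timestamp"].is_monotonic_increasing:
        findings.append("timestamps are not monotonically increasing")
    gaps = df["timestamp"].diff().dropna()
    if (gaps > pd.Timedelta(minutes=15)).any():
        findings.append("gaps larger than the expected 15-minute cadence")
    out_of_range = ~df["value"].between(0, 5000)  # assumed plausible range; NaN counts as out of range
    if out_of_range.any():
        findings.append(f"{int(out_of_range.sum())} values outside plausible range")
    return findings

df = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=6, freq="15min"),
    "value": [100.0, 110.0, None, 95.0, 120.0, 9000.0],
})
print(validate_measurements(df))
```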
Stakeholder Engagement:
Regularly engaged with stakeholders to align technical implementations with strategic objectives, ensuring that modeling efforts directly support organizational goals and operational requirements.
Tech Stack:
MLFlow, Kafka, ArgoCD, Helm, Kubernetes, Docker, AzureDevOps, Azure ML/AI, Azure Registry, Postgres, TimescaleDB, Grafana, OpenTelemetry, S3 Buckets, Python, .NET, Pandas, PyTorch, Numpy, FastAPI, SonarQube, scikit-learn
- Conceptual Design & Implementation: Focused on the development of scalable software solutions for monitoring the German electrical grid.
- DevOps Practices: Employing DevOps methodologies for efficient and effective project lifecycle management.
- Machine Learning & Rule-Based Systems: Integrating advanced machine learning techniques and rule-based mechanisms for enhanced grid analysis (an illustrative sketch follows after this list).
- Project Rollout: Leading the rollout of smart energy initiatives across Germany.
- Team Collaboration: Working within a Scrum framework, promoting agile and collaborative team dynamics.
- Innovative Problem-Solving: Driving innovation in tackling complex challenges in grid monitoring.
- Stakeholder Engagement: Actively communicating with stakeholders to align software development with organizational goals and grid requirements.
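Purely as an illustration of combining rule-based mechanisms with an ML score, as mentioned in the list above: the thresholds, features, and toy classifier below are assumptions and not the actual grid-monitoring logic.

```python
# Illustrative sketch only: combining a hard rule with an ML score to flag
# grid measurements. Thresholds, features and the toy classifier are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
X_train = rng.normal(size=(400, 3))              # e.g. voltage, current, frequency deviations
y_train = (X_train.sum(axis=1) > 2).astype(int)  # synthetic "suspicious" label
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

def flag_measurement(sample: np.ndarray, frequency_hz: float) -> bool:
    rule_hit = abs(frequency_hz - 50.0) > 0.2    # hard rule: frequency deviation (assumed limit)
    ml_score = clf.predict_proba(sample.reshape(1, -1))[0, 1]
    return rule_hit or ml_score > 0.8            # assumed decision threshold

print(flag_measurement(rng.normal(size=3), frequency_hz=49.7))
```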
- SDK Development: Creating a Python SDK for internal sensor data access, optimized for data scientists (a minimal client sketch follows after this list).
- User-Centric Design: Ensuring the SDK is intuitive and meets the specific needs of data scientists.
- Authentication Standards: Implementing secure authentication aligned with Azure standards.
- Best Practice Adoption: Establishing the SDK as a monorepo exemplar for Python projects in the organization.
- Documentation & Support: Providing comprehensive documentation and support for the SDK users.
- Continuous Integration/Continuous Deployment (CI/CD): Implementing CI/CD practices for efficient development and deployment of the SDK.
- Collaboration & Feedback Integration: Collaborating with end-users and stakeholders to continuously refine and enhance the SDK functionalities.
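A minimal sketch of what such an SDK client with Azure token authentication can look like; the base URL, token scope, and endpoint path are placeholders, not the real internal API.

```python
# Minimal SDK-client sketch (illustrative). The base URL, token scope and
# endpoint path are placeholders, not the real internal API.
import requests
from azure.identity import DefaultAzureCredential

class SensorClient:
    """Thin wrapper giving data scientists authenticated access to sensor data."""

    def __init__(self, base_url: str, scope: str):
        self._base_url = base_url.rstrip("/")
        self._scope = scope
        self._credential = DefaultAzureCredential()

    def _headers(self) -> dict:
        token = self._credential.get_token(self._scope).token
        return {"Authorization": f"Bearer {token}"}

    def get_measurements(self, sensor_id: str, start: str, end: str) -> list:
        resp = requests.get(
            f"{self._base_url}/sensors/{sensor_id}/measurements",
            params={"start": start, "end": end},
            headers=self._headers(),
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

# Usage (placeholder values):
# client = SensorClient("https://sensors.example.internal", "api://sensor-api/.default")
# data = client.get_measurements("sensor-123", "2024-01-01", "2024-01-02")
```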

exali professional liability insurance seal
The original exali professional liability seal confirms to the client that the person or company in question holds a currently valid, industry-specific professional or business liability insurance policy.
Insured until: 01.07.2026