Updated 26.04.2025


Premium customer
100 % available
Senior Software Developer / Data Engineer / DevOps
Neufahrn bei Freising, Germany
Available worldwide
Business Informatics

Skills
Clean Code, Java, Agile Methodology, Test Automation, Cloud Computing, CQRS, Information Engineering, ETL, DevOps, Django, Elasticsearch, GitHub, Apache Hadoop, Recruiting, Python, PostgreSQL, Machine Learning, Micrometer, Natural Language Processing, Neo4j, OAuth, OpenShift, Oracle Financials, Scrum, Redis, OpenID Connect, Ansible, Prometheus, Cloudera, Mesos, Statistics, TypeScript, Test-Driven Development, Grafana, Apache Spark, Spring Boot, Deep Learning, Cobra, FastAPI, Kotlin, Kanban, AngularJS, Kubernetes, Cassandra, Apache Kafka, Search Engines, GraphQL, Terraform, Domain-Driven Design, Refactoring, Docker, Jenkins, Microservices
Short summary of my experience:
- Java, Python: > 14 years
- Spring: > 8 years
- Kafka: > 7 years
- Spark, Hadoop: > 9 years
- Software architecture: > 7 years
- Cloud environments (Kubernetes, OpenShift, Mesos): > 6 years
- Machine learning (NLP, statistical ML): > 6 years
- Agile development (Scrum): > 10 years
Programming languages:
- Java
- Python
- Kotlin
- Go
- TypeScript
Frameworks:
- Spring Boot
- Micronaut
- Kafka Streams
- Micrometer (monitoring)
- Angular
- Spark
- Cobra (Go)
- Operator SDK (Kubernetes, Go)
- Django (Python)
- FastAPI (Python)
Methods:
- Clean Code
- Refactoring
- Test-Driven Development
- Agile Development (Scrum, Kanban)
- Cloud Native Development
- Domain Driven Design
- DevOps
- Test Automation
DevOps Tools:
- Ansible
- Kustomize
- Helm
- Jenkins
- ArgoCD
- JFrog Artifactory
- Github
- Terraform
- Keycloak
- Prometheus
- Grafana
Databases:
- Neo4j
- Elasticsearch
- Postgres
- Redis
- Oracle
- Cassandra
Platform:
- Kubernetes
- OpenShift
- Kafka
- Docker
- Cloudera (Hadoop)
Technologies:
- Web development
- OAuth2
- OpenID Connect
- Real-time ETL
- Event Sourcing (CQRS)
- GraphQL
- Cypher
- Microservices
AI:
- Natural Language Processing
- Deep Learning
- Statistical Machine Learning
- Recommendation Engine
- Search Engine
Languages
- Chinese: native speaker
- German: business fluent
- English: good
Project History
Description:
This project involved the evaluation of numerous cutting-edge technologies. The new CRM system is based on the Event Sourcing/CQRS design pattern, which significantly increased data transfer speed: transaction data that was previously transferred monthly is now available within seconds. The microservice architecture also reduced the complexity of software development considerably, and an automated CI/CD pipeline simplified the product development life cycle.
The Kafka streaming platform records all user-generated transactional data from contract processing and processes it in real time.
Key deliveries:
- Design and implementation of a Kafka-based real-time event sourcing system
- Implementation of a Kappa architecture
- Implementation of a Kerberos-authenticated security client for backend data transfer
- Implementation of OAuth with Keycloak for microservice authentication and authorization
- Implementation of a GraphQL OAuth security library for backend authorization
- Design and implementation of a Kubernetes operator for installing production-like infrastructure on developer machines
- Implementation of a library simplifying monitoring of Micronaut and Kafka Streams based applications
- Evaluation and implementation of a graph service with Neo4j and Kafka for tracking and analyzing real-time events in the event sourcing system
- Building a CI/CD pipeline with Jenkins, Go (Cobra), and Python
- DevOps and third-level support
- Technical consulting
Tools: Kafka, OpenShift, Keycloak, Spring Boot, Micronaut, Kotlin, Go, Python, shell scripting, Jenkins, Operator SDK, Cypher, GraphQL, Kustomize, Helm
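The Event Sourcing/CQRS pattern behind this project can be illustrated with a minimal in-memory sketch (plain Python lists and dicts standing in for Kafka topics and stream processors; all names and event types are illustrative, not taken from the actual system):

```python
from collections import defaultdict
from dataclasses import dataclass

# In an event-sourced system, every state change is an immutable event
# appended to a log (here a plain list standing in for a Kafka topic).
# Read models are projections rebuilt by replaying the log.

@dataclass(frozen=True)
class ContractEvent:
    contract_id: str
    event_type: str   # e.g. "created", "amount_changed" (illustrative)
    amount: int

event_log: list[ContractEvent] = []  # write side: append-only log

def append_event(event: ContractEvent) -> None:
    """Command side: the only way to change state is to record a new event."""
    event_log.append(event)

def project_balances(events: list[ContractEvent]) -> dict[str, int]:
    """Query side: rebuild a read model by replaying all events."""
    balances: dict[str, int] = defaultdict(int)
    for e in events:
        balances[e.contract_id] += e.amount
    return dict(balances)

append_event(ContractEvent("c1", "created", 100))
append_event(ContractEvent("c1", "amount_changed", -30))
append_event(ContractEvent("c2", "created", 50))
print(project_balances(event_log))  # {'c1': 70, 'c2': 50}
```

Replaying the full log to rebuild a projection is also the core idea of the Kappa architecture mentioned above: there is only one (streaming) code path, and reprocessing means replaying the log through it.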
Description: Design and construction of a Cloudera Hadoop-based data platform with multi-tenant features.
Key deliveries:
- Technical project management
- Design and implementation of a real-time data processing pipeline
- Implementation of some of the machine learning use cases in production
- Platform administration
Tools: Cloudera Data Science Workbench, OpenShift Container Platform, Spark
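The real-time processing pipeline mentioned above typically centers on windowed aggregation; a minimal sketch in plain Python standing in for Spark Structured Streaming or Kafka Streams (the window size, keys, and values are made up for illustration):

```python
from collections import defaultdict

# Tumbling-window aggregation: each event carries an event-time timestamp
# and a value; events are grouped into fixed, non-overlapping windows and
# summed per key -- the core operation behind streaming aggregates.

WINDOW_SECONDS = 60

def window_start(ts: int) -> int:
    """Align a timestamp to the start of its tumbling window."""
    return ts - (ts % WINDOW_SECONDS)

def aggregate(events: list[tuple[int, str, float]]) -> dict[tuple[int, str], float]:
    """Sum values per (window, key); events are (timestamp, key, value)."""
    sums: dict[tuple[int, str], float] = defaultdict(float)
    for ts, key, value in events:
        sums[(window_start(ts), key)] += value
    return dict(sums)

events = [(0, "sensor-a", 1.0), (30, "sensor-a", 2.0),
          (65, "sensor-a", 4.0), (10, "sensor-b", 3.0)]
print(aggregate(events))
# {(0, 'sensor-a'): 3.0, (60, 'sensor-a'): 4.0, (0, 'sensor-b'): 3.0}
```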
Data Lake (Client project at MAN Bus & Trucks)
Company: Reply AG
Position: Senior Software Engineer / DevOps
Description:
The platform is based on Cloudera 5.9 and enables the storage of data from multiple source systems. Docker technology simplified company-wide data access via sandboxes as well as data analytics. Kafka served as the central data hub for real-time data ingestion from multiple source systems, e.g. database transaction data and log files.
Key deliveries:
- Cluster operation with Cloudera Manager, Marathon, etc.
- Data ingestion with Sqoop and Kafka
- Creating Docker images for microservices
- Implementing ETL jobs for data migration
Tools: Cloudera 5.9 stack, Mesos, Marathon, Docker, Calico, Confluent Kafka, Elasticsearch, Kibana, GitLab, GitHub, GlusterFS, Consul, Chronos, Jupyter, R Shiny Server, Java, Python, shell, Ansible, IntelliJ, etc.
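The ETL jobs for data migration mentioned above follow the usual extract-transform-load shape; a minimal, self-contained sketch with SQLite standing in for the real source and target systems (schema, table names, and the normalization rules are invented for illustration):

```python
import sqlite3

# Minimal ETL: extract rows from a source table, normalize them,
# load them into a target table. SQLite stands in for the real
# source/target systems; schema and names are illustrative.

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_users (name TEXT, email TEXT)")
src.executemany("INSERT INTO raw_users VALUES (?, ?)",
                [(" Alice ", "ALICE@EXAMPLE.COM"), ("Bob", "bob@example.com")])

dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE users (name TEXT, email TEXT)")

def transform(row):
    """Normalize whitespace in names and lower-case email addresses."""
    name, email = row
    return name.strip(), email.lower()

# Extract -> transform -> load
rows = src.execute("SELECT name, email FROM raw_users")
dst.executemany("INSERT INTO users VALUES (?, ?)",
                (transform(r) for r in rows))
dst.commit()

print(dst.execute("SELECT name, email FROM users").fetchall())
# [('Alice', 'alice@example.com'), ('Bob', 'bob@example.com')]
```

In a real migration the same extract/transform/load split applies; only the connectors (e.g. Sqoop, Kafka) and the transformation rules change.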