Overview

Description

Full-time, remote

The goal of the IDDI (Instant Dynamic Data Integration) project is to implement a central platform for the event-based exchange of data and business objects between different IT systems within DB.

The introduction of a central enterprise service bus using Kafka technology will enable asynchronous exchange and event-based data communication. Instead of numerous bilateral data connections, only one normalized interface is needed on the producer's side.

This will transform DB Regio AG’s complex and distributed IT environment into a state-of-the-art (enterprise) IT architecture.

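To illustrate the idea of a single normalized interface on the producer's side, here is a minimal sketch using the standard Apache Kafka Java client (kafka-clients); the broker address, topic name, key, and payload are hypothetical placeholders, not details from the project:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class EventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Placeholder address; in practice this would point at the central event broker.
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            // One normalized interface: the producer publishes events to a topic
            // instead of maintaining bilateral connections to each consumer system.
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("business-events", "order-4711", "{\"status\":\"created\"}"));
            }
        }
    }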

Tasks:

– Programming with Java, GitLab, CI/CD pipelines, and AWS ECS

– Developing infrastructure as code (IaC) using CloudFormation and Terraform

– Design and development of Apache Kafka connectors based on cloud technologies

– Design and development of Kafka producers and consumers in Java (Maven/Gradle) connecting applications to a central event broker (a minimal consumer sketch follows this list)

– Bridging between design and development for the implementation of prototypes, and consulting on technical aspects of the implementation

– Creation and handling of Docker images

– Development with various AWS services (ECS, S3, EC2, IAM, CloudWatch, …)
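As a counterpart to the producer sketch above, and as referenced in the producer/consumer task, here is a minimal consumer sketch with the same kafka-clients library; the broker address, consumer group, and topic are again placeholders:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class EventConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
            props.put("group.id", "iddi-demo");               // hypothetical consumer group
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("business-events"));
                // Events arrive asynchronously; the consumer is fully decoupled from the producer.
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("key=%s value=%s%n", record.key(), record.value());
                    }
                }
            }
        }
    }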

Must-have requirements:

– Five+ years of AWS cloud DevOps experience using GitLab CI/CD pipelines in Java (using Maven/Gradle)

– Very good knowledge of developing and implementing data integration layers through APIs and scalable interfaces to connect different applications and services for and with cloud environments (focus: AWS) and cloud services (esp. ECS, CloudWatch)

– Very good knowledge of cloud-based container services

– Very good knowledge of infrastructure as code (AWS CloudFormation/Terraform)

– Very good experience in building event-driven architectures and handling near-real-time and real-time data streams

Target requirements:

– Experience in agile software development, including in large development environments (e.g. following Scrum or SAFe)

– Experience (ideally at least 3 years) in the development of complex enterprise applications

– Very good knowledge of Apache Kafka and AWS MSK; experience with AWS Glue, Lambda, Kinesis, and AWS IoT is desirable

– Good understanding of networking principles and protocols, including VPC configuration, routing, and network security concepts within AWS




About ZeilenJOB

Portal for remote jobs