Overview
Description
Your responsibilities:
– Further development and maintenance of a Python application consisting of a data engineering pipeline (ETL) module and a web backend (API) module
– Advising the data engineers on the further development of the ETL pipeline (AWS Step Functions, AWS Batch, AWS Lambda, AWS S3, PostgreSQL)
– Deployment and operation of the application in a modern cloud architecture (Docker, Kubernetes, GitLab CI, ArgoCD)
– Ensuring operational management of the application using monitoring tools (Prometheus, Grafana, Loki, AlertManager)
– Developing and executing appropriate test procedures to ensure code quality
– Creating technical interface agreements and coordinating with stakeholders
– Automating and monitoring data connections to external systems
Must-have requirements:
– Sound knowledge of software development (e.g. Python, C++, Java, Golang)
– Experience in developing automated tests (unit, integration and regression tests)
– Sound knowledge of containerization (e.g. Docker, Kubernetes, Helm)
– Sound knowledge of setting up infrastructure using IaC tools (e.g. Terraform, Ansible, GitLab, ArgoCD)
– Sound knowledge of application monitoring (e.g. Prometheus, Fluentd, Grafana)
Desirable requirements:
– AWS knowledge, especially AWS S3, AWS Lambda, AWS RDS, AWS Step Functions and AWS Batch
– Experience in database design for big data applications
– Experience in developing ETL pipelines
– Ideally, initial hands-on experience with an application following an end-to-end DevOps approach
– Recent experience in cross-industry, international data and AI projects for digital transformation (e.g. in mobility, logistics and production)
– Experience with agile ways of working
About ZeilenJOB
A portal for remote jobs