Senior Data Engineer for Data Pipelines (Python, Airflow, Shell, Data Lake, Kubernetes, OpenShift) (m/f/d), Remote/Berlin

Posted

8/28/2024

Role

Data Engineer

Location

Berlin, Germany

Remote

Discuss with client

Project Description

We are looking for a Senior Data Engineer with expertise in building data pipelines using Python, Airflow, shell scripting, Data Lakes, Kubernetes, and OpenShift for a predominantly remote project based in Berlin. The ideal candidate has a strong background in PL/SQL, Python, Airflow, Bash/shell scripting, data modeling, and software development, with at least 6 months of experience in agile methodology and public-sector projects. Knowledge of Batch/Lambda/Kappa architectures, Kubernetes, Red Hat OpenShift, Data Lake development, R programming, machine learning pipelines, Argo Workflows, CI/CD pipelines, and tools such as GitLab, Artifactory, Confluence, and Jira is also required. Good communication skills in both German and English are essential. The role is predominantly remote, with minimal on-site presence required for onboarding/offboarding and occasional meetings. If you meet these qualifications and are available to start on September 16, 2024, please apply now.

Skills

Airflow, Python, OpenShift, Data Lake, Kubernetes, Amazon S3, Confluence, Jira, Bash/Shell, Consulting, Continuous Integration, Data Architecture, Data Modeling, Sales, Machines, Scrum, Software Development, PL/SQL, Workflows, Offboarding, GitLab, Machine Learning Operations, Artifactory