Azure Data/DevOps Engineer (m/f/x)

  • Deutsche Post & DHL
  • Bonn, Germany
  • 07/07/2022
Full time · Data Science · Data Engineering · Data Analytics · Big Data · Data Management · Statistics

Job Description


BONN/DARMSTADT, PERMANENT AND FULL TIME

About DHL IT Services and the AI & Analytics team

At DHL IT Services we design, build and run IT solutions for the whole of DPDHL globally. The AI & Analytics team builds and runs solutions to get much more value out of our data. We help our business colleagues all over the world with machine learning algorithms, predictive models and visualizations. We manage more than 50 AI & Big Data applications, with 3,000 active users in 87 countries and up to 100,000,000 daily transactions.
Integrating AI & Big Data into business processes to compete in a data-driven world requires state-of-the-art technology. Our infrastructure, hosted on-premise and in the cloud (Azure and GCP), includes MapR, Airflow, Spark, Kafka, Jupyter, Kubeflow, Jenkins, GitHub, Tableau, Power BI, Synapse (Analytics), Databricks and other exciting tools.

We like to do everything in an Agile/DevOps way. No more throwing the “problem code” over to support, and no silos. Our teams are completely product-oriented, with end-to-end responsibility for the success of our products.

Who are we looking for?
Currently, we are looking for an Azure Data/DevOps Engineer (m/f/x) with a strong data engineering skill set and relevant experience. In this role, you will have the opportunity to design and develop solutions on our Azure Spokes. We are looking for someone to help us manage the petabytes of data we have and turn them into something valuable.

Does that sound a bit like you? Let’s talk! Even if you don’t tick all the boxes below, we’d love to hear from you. Our new department is rapidly growing and we’re looking for people with a can-do mindset to join us on our digitalization journey. Thank you for considering DHL as the next step in your career – we really do believe we can make a difference together!

What will you need?

  • University Degree in Computer Science, Information Systems, Business Administration, or related field
  • Azure Data stack experience: Spark/Databricks, ADLS Gen2/Blob, Event Hubs/Kafka, ADE, SQL, Azure Synapse/SQL DWH, Python/Scala programming, Azure Functions
      ◦ Develop central ETL routines and stored procedures (if needed)
      ◦ Develop the logical domain model in the capture layer
      ◦ Develop logical data models in the serve layer
      ◦ Administer and maintain (run) the central data platform on Azure Cloud and ensure data consistency
      ◦ Develop use-case-specific data pipelines if needed
  • Azure DevOps/Terraform experience
      ◦ Maintain and configure Azure DevOps CI/CD pipelines, repositories and artifacts
      ◦ Develop build/deploy (pipeline) and Terraform (IaC) scripts and tools (automation) based on CorpDL templates
      ◦ Design and implement unit and integration testing for pipelines and IaC deployments
  • Programming experience:
      ◦ Scala or Python

You should have:

  • Certifications in some of the core technologies

Language requirements:

  • English – Fluent spoken and written (C1 level)

Your benefits:

As an employer, we offer excellent social benefits, competitive salary structures and relevant development opportunities. In addition, you can benefit from our flexible working time with compensatory time off and the possibility to work from home. Furthermore, we offer free parking spaces, a job ticket for our employees in Bonn, and other benefits via our employee portal.

Your contact:

For further information about the position please contact Arne Wieczorrek, phone +49 228 18927327.

#digitalplatforms