Data Engineer - Data Analytics Platform (f/m/d) - Cologne, Paderborn or remote

  • DeepL GmbH
  • Tacitusstraße, Cologne, Germany
  • 17/02/2021
Full time · Data Science, Data Engineering, Data Analytics, Big Data, Data Management, Statistics

Job Description

DeepL is Germany's best-known AI company. We develop neural networks to help people work with language. With DeepL Translator, we have created the world's best machine translation system and made it available free of charge to everyone online. Over the next few years, we aim to make DeepL the world's leading language technology company.

Our goal is to overcome language barriers and bring cultures closer together.

What distinguishes us from other companies?

DeepL (formerly Linguee) was founded by developers and researchers. We focus on developing new, exciting products, which is why we spend a lot of time actively researching the latest topics. We understand the challenges of building new products and try to meet them with an agile and dynamic way of working. Our work culture is very open because we want our employees to feel comfortable. In our daily work we use modern technologies - not only to translate texts, but also to create the world's best dictionaries and solve other language problems.

When we tell people about DeepL as an employer, reactions are overwhelmingly positive. Maybe it's because they have enjoyed our services, or maybe they just want to get on board with our quest to break down language barriers and facilitate communication.

Your choice

We are constantly looking for outstanding employees! Currently we offer remote work in Germany, the Netherlands, the UK and Poland. Whether you would like to work from home in one of these countries or from one of our offices in Cologne or Paderborn: the choice is yours. No matter where you choose to work from, our way of working is designed to make you an essential part of the team.

What will you be doing at DeepL?

We are looking for an experienced data engineer to help build and improve our in-house Data Platform. The Data Platform combines different data sources, both internal and external, and makes them available to our stakeholders company-wide: Developers, Product Development, Analytics and Management. You will work in a cross-functional team with product managers, data analysts, data engineers and developers to make the future at DeepL bright and data-driven.

Your responsibilities

  • Improve and maintain our data streaming pipelines (see the sketch after this list for a flavour of this work)
  • Monitor and maintain the entire data platform and know which levers to pull when something goes wrong
  • Build out our data infrastructure and manage dependencies between data pipelines
  • Implement new microservices and software for internal tooling, ETL jobs and our streaming pipelines
  • Deploy these applications to production and look after them and their dependencies
  • Define new requirements and constantly challenge our ideas
  • Last but not least, you are not afraid of trying new technologies and approaches, even if Stack Overflow says it can't be done
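
To give a flavour of the streaming work above: the sketch below consumes JSON events from a Kafka topic and applies a small transformation before passing them on. This is a minimal illustration under assumed names only - the topic, broker address and event fields are hypothetical, and the kafka-python client stands in for whichever client our pipelines actually use.

    # Minimal streaming-pipeline sketch (kafka-python; all names are hypothetical).
    import json

    from kafka import KafkaConsumer  # pip install kafka-python

    consumer = KafkaConsumer(
        "page-events",                       # hypothetical topic
        bootstrap_servers="localhost:9092",  # hypothetical broker
        group_id="analytics-etl",
        auto_offset_reset="earliest",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # Skip events without a user id and normalise the timestamp field.
        if "user_id" not in event:
            continue
        event["ts"] = int(event.get("ts", 0))
        # A real pipeline would write to ClickHouse or a downstream topic;
        # printing stands in for that sink here.
        print(event)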

What we offer

  • Data at scale from products used by more than 100 million people worldwide
  • Our own analytics and experimentation platform - far beyond the limitations of standard web analytics platforms
  • The chance to work and play with state-of-the-art technologies like ClickHouse, Kubernetes and Kafka
  • Interesting challenges: design and programming at the highest level
  • A friendly, international, and highly committed team with a lot of trust and very short decision-making processes
  • Meaningful work: We break down language barriers worldwide and bring different cultures closer together
  • A comfortable office in Cologne or Paderborn (or suitable equipment for your home office) and a lot of flexibility

About you

  • 2+ years of industry experience as a data engineer
  • A university degree in computer science, information systems or a related technical field, or a comparable qualification
  • Expert knowledge of Python and basic knowledge of a compiled language (such as C#, C++ or Java)
  • A solid understanding of SQL
  • Hands-on experience with event streaming and common technologies like Apache Kafka
  • You know your way around Linux and can debug a failing system
  • Familiarity with processing structured and unstructured data at scale
  • You have worked with containers (like Docker) and automated their builds
  • Good communication and teamwork skills
  • Fluent in English; German is a plus

We look forward to your application!