Backend Developer C# - Data Analytics Platform (f/m/d) - Cologne, Paderborn or remote

  • DeepL GmbH
  • Home office of Nadine Capoen, Karl-Marx-Straße, Birkenwerder, Germany
  • 30/03/2021
Full time · Data Science · Data Engineering · Data Analytics · Big Data · Data Management · Statistics

Job Description

DeepL is Germany's best-known AI company. We develop neural networks to help people work with language. With DeepL Translator, we have created the world's best machine translation system and made it available free of charge to everyone online. Over the next few years, we aim to make DeepL the world's leading language technology company.

Our goal is to overcome language barriers and bring cultures closer together.

What distinguishes us from other companies?

DeepL (formerly Linguee) was founded by developers and researchers. We focus on the development of new, exciting products, which is why we spend a lot of time actively researching the latest topics. We understand the challenges of developing new products and try to meet them with an agile and dynamic way of working. Our work culture is very open because we want our employees to feel comfortable. In our daily work, we use modern technologies not only to translate texts, but also to create the world's best dictionaries and to solve other language problems.

When we tell people about DeepL as an employer, reactions are overwhelmingly positive. Maybe it's because they have enjoyed our services, or maybe they just want to get on board with our quest to break down language barriers and facilitate communication.


Your choice
We are constantly looking for outstanding employees! Currently we offer remote work in Germany, the Netherlands, the UK and Poland. Whether you would like to work from home in one of these countries or from one of our offices in Cologne or Paderborn: the choice is yours. No matter where you choose to work from, our way of working is designed to make you an essential part of the team.

What will you be doing at DeepL?

We are looking for an experienced software developer to help build and improve our in-house Data Platform. The Data Platform combines different data sources, both internal and external, and makes them available to our stakeholders company-wide: Developers, Product Development, Analytics and Management. You will work in a cross-functional team with product managers, data analysts and data engineers to make the future at DeepL bright and data-driven.

Your responsibilities

  • Create services in C# and Python for our data pipelines (see the sketch after this list)
  • Write internal tooling and software for our streaming pipelines
  • Manage automation and deployment using GitLab CI, Argo CD and Kubernetes
  • Work with other teams to define and improve our APIs
  • Contribute to architecture and technology decisions and constantly challenge our ideas
  • Last but not least, you are not afraid to try new technologies and approaches, even if StackOverflow says it can't be done
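To give a concrete flavour of the service work mentioned above, here is a minimal ASP.NET Core (minimal API) sketch of an ingestion-style endpoint. It is an illustration only, not DeepL code: the /events route, the PipelineEvent record and the idea of handing events off to a downstream pipeline are placeholder assumptions.

```csharp
// Minimal ASP.NET Core sketch of a hypothetical ingestion endpoint.
// The route, the PipelineEvent record and the downstream hand-off are illustrative only.
using System;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Accept an event and acknowledge it; a real service would validate the payload
// and forward it to whatever feeds the data pipelines (e.g. a message broker).
app.MapPost("/events", (PipelineEvent evt) =>
    Results.Accepted($"/events/{evt.Id}"));

app.Run();

// Illustrative payload type; the real schema is not part of this posting.
record PipelineEvent(string Id, string Name, DateTimeOffset Timestamp);
```

In practice, such an endpoint would hand the event to a queue or broker rather than just acknowledging it; that wiring is deliberately left out of the sketch.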

What we offer

  • The opportunity to work on an in-house analytics and experimentation platform - far beyond the limitations of standard web analytics platforms - which handles data at scale from products used by more than 100 million people worldwide
  • The chance to design and build state-of-the-art APIs and tools and to work with modern technologies
  • Interesting challenges: work as part of a highly motivated team that values pragmatic and elegant solutions and develops efficient software for data at scale
  • A friendly, international, and highly committed team with a lot of trust and very short decision-making processes
  • Meaningful work: We break down language barriers worldwide and bring different cultures closer together
  • A comfortable office in Cologne or Paderborn (or suitable equipment for your home office) and a lot of flexibility

About you

  • 2+ years of industry experience as a software developer
  • A university degree in computer science, information systems, or a similar technical field, or equivalent experience
  • Practical experience in C# and ASP.NET Core; experience in Python desirable
  • A solid understanding of SQL
  • You know your way around Linux and can debug a failing system
  • You have worked with Docker containers and automated their creation
  • Familiarity with processing structured and unstructured data at scale
  • Experience with the following technologies is a plus: Apache Kafka, ClickHouse, Kubernetes, GitLab CI
  • Hands-on experience with event streaming is also a plus (see the sketch after this list)
  • Good communication skills and the ability to work well in a team
  • Fluent in English; German is nice to have
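Since Apache Kafka and event streaming are listed as a plus, here is a rough, hedged sketch of consuming events with the Confluent.Kafka client for C#. The broker address, topic name and consumer group are placeholder values, and the sketch says nothing about how the actual pipelines at DeepL are wired.

```csharp
// Illustrative Kafka consumer using the Confluent.Kafka NuGet package.
// Broker, topic and group id are placeholders, not real infrastructure details.
using System;
using System.Threading;
using Confluent.Kafka;

var config = new ConsumerConfig
{
    BootstrapServers = "localhost:9092",      // placeholder broker
    GroupId = "data-platform-example",        // placeholder consumer group
    AutoOffsetReset = AutoOffsetReset.Earliest
};

using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
consumer.Subscribe("events");                 // placeholder topic

using var cts = new CancellationTokenSource();
Console.CancelKeyPress += (_, e) => { e.Cancel = true; cts.Cancel(); };

try
{
    while (!cts.IsCancellationRequested)
    {
        // Blocks until a message arrives or the token is cancelled.
        var result = consumer.Consume(cts.Token);
        Console.WriteLine($"{result.TopicPartitionOffset}: {result.Message.Value}");
    }
}
catch (OperationCanceledException)
{
    // Expected when Ctrl+C cancels the token.
}
finally
{
    consumer.Close();                         // commit offsets and leave the group cleanly
}
```

A production consumer would add error handling, schema-aware deserialization and offset management suited to the pipeline, none of which is specified in this posting.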