MASTER'S THESIS IN MACHINE LEARNING

  • BrainLab
  • Munich, Germany
  • 31/12/2018
Full time | Data Science | Data Engineering | Machine Learning | Data Analytics | Artificial Intelligence | Statistics | Software Engineering

Job Description

Machine learning algorithms have become highly popular and successful for the segmentation of medical image data, including 3D anatomical data as produced by MRI and CT scanners. Nevertheless, the more traditional atlas-based methods still play an important role in many practical applications in this field and offer some compelling advantages. For a research project on combining the best of both worlds, we are looking for a highly motivated master's student to support our Artificial Intelligence Team.

The thesis will build upon recent advances in machine learning-based deformable image registration and investigate the potential of such approaches in the context of a real medical segmentation product. The student will work with the latest deep learning methods and extend the current state of the art to handle a number of challenges that arise in real-world applications (e.g. variability in anatomy, pathologies, multi-modal data, and more). The developed algorithms will be benchmarked against previous approaches using data of high practical relevance.
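For candidates less familiar with the topic: the operation at the heart of deformable registration, whether atlas-based or learned, is resampling one image through a dense displacement field; atlas-based segmentation then propagates the atlas labels through the same warp. A minimal 2D sketch of this warping step (illustrative only, not code from the project; the function name and field layout are our own choices) might look like:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp(image, displacement):
    """Warp a 2D image by a dense displacement field of shape (H, W, 2).

    Each output pixel p is sampled from the input at p + displacement[p]
    using bilinear interpolation (order=1). This is the resampling step
    shared by classical and deep-learning-based deformable registration.
    """
    h, w = image.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    coords = np.stack([ys + displacement[..., 0],   # row coordinates
                       xs + displacement[..., 1]])  # column coordinates
    # mode="nearest" clamps samples that fall outside the image border
    return map_coordinates(image, coords, order=1, mode="nearest")

# A zero displacement field is the identity transform.
img = np.arange(16, dtype=float).reshape(4, 4)
zero = np.zeros((4, 4, 2))
assert np.allclose(warp(img, zero), img)
```

In a learning-based setting, a network predicts the displacement field and this warping layer is made differentiable so the whole pipeline can be trained end to end.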

WHAT YOU OFFER

  • Currently studying Mathematics, Physics, Informatics, or a similar subject with a focus on machine learning
  • Practical experience in building deep learning models, ideally with TensorFlow
  • Good Python programming skills
  • Knowledge of medical image processing is an advantage

WHAT WE OFFER

  • Working in a highly motivated team at the headquarters of an exciting international high-tech company
  • A professional and performance-oriented business environment
  • Weekly student lunches and monthly after-work events
  • An award-winning company restaurant
  • An in-house, state-of-the-art 360 m² fitness studio

Interested? Then we look forward to receiving your online application directly through our website!

Contact person: Veronika Hofmann