Student assistant research work
We are looking for students to join our team in teaching and/or research roles, working 40, 60, or 80 hours per month.
Please review the main job opportunities page for general information.
Tasks and current projects
Research
Tasks
General tasks to be carried out under guided supervision:
- Implementing existing and novel machine learning models (e.g., data preparation and model training on a GPU cluster), as well as documenting and presenting code and experimental results to the supervising scientists
- Analyzing and visualizing data
- Reviewing existing machine learning literature
- Providing input on conceptual ideas and assisting in writing scientific papers
- Other support for the department (max. 20%): assisting with our teaching obligations, e.g., helping to prepare courses such as “Machine Learning”, “Cognitive Algorithms”, or “Python for Machine Learning”, supporting exams, and preparing exercise and course materials. If you would prefer a greater focus on this type of work, let us know and we will check our vacancies for suitable roles.
All projects require the general skills listed above, in particular the ability to understand research ideas and implement them in code.
Example research areas
- Probabilistic machine learning
  - Developing novel probabilistic models with efficient inference methods
  - Exploring novel applications of probabilistic models
  - Establishing uncertainty estimation methods for deep probabilistic models
  - Example models: generative methods such as diffusion- and score-based models
  - Example application areas: natural sciences (physics/chemistry), computer vision, natural language processing
- Quantum chemistry
  - Machine Learning for Molecular Simulations: Developing ML methods to tackle many-body problems in quantum chemistry, where interactions scale rapidly with system size.
  - Computational & Statistical Integration: Combining computational physics principles with statistical modeling to enhance the understanding of quantum phenomena.
  - Data-Driven Insights: Leveraging data-driven approaches to reframe complex questions and offer new perspectives on established problems.
- Explainable machine learning
  - Advancing Explainable AI (XAI) for Deep Learning: Developing foundational algorithms to bridge the gap between existing XAI methods and practical needs, enhancing trustworthiness and autonomy in machine learning models.
  - XAI for Real-World Applications: Investigating how XAI can assess model reliability for autonomous decision-making and how it can help identify actionable components in complex real-world systems.
- Pathology
  - Developing machine learning methods for medical image analysis, e.g., using decoder-free architectures to improve the classification of diseases such as cancer.
- For Biomedical Engineering in the IBS group, please contact the group lead, Alexander von Lühmann, directly.
The above list is not exhaustive.
Please review our research profile, publications, and team pages for more information.
You do not necessarily need to have a domain preference when applying.
Furthermore, you do not usually have to be an expert in the domains listed above.
It is usually more important to have strong foundational skills, e.g., in programming.