I am a researcher in Applied Mathematics looking for a research assistant to work on an interesting problem in the field of deep learning, with a focus on neural ordinary differential equations (ODEs). A neural ODE defines its output as the solution to an ODE parameterised by a neural network. Due to this formulation, the gradients during the training of neural ODEs are computed differently from those of a standard neural network. How can we characterise gradient dynamics during the training of neural ODEs in a way that elicits regularisation methods for more effective training?

To start, you will implement a number of neural ODE baselines in a common framework (PyTorch or JAX) and investigate the loss Hessian of the neural ODE, with the goal of characterising gradient dynamics.
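As a rough illustration of the kind of analysis involved, the sketch below trains nothing but shows one way to differentiate through a neural ODE and probe the loss Hessian via Hessian-vector products. All names (ODEFunc, rk4_solve) and choices (a fixed-step RK4 solver with backpropagation through the unrolled steps, rather than the adjoint method) are illustrative assumptions, not a prescription for the project.

```python
import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    """Hypothetical vector field f_theta(z) defining dz/dt = f_theta(z)."""
    def __init__(self, dim=2, hidden=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim)
        )

    def forward(self, z):
        return self.net(z)

def rk4_solve(f, z0, t0=0.0, t1=1.0, steps=10):
    """Fixed-step RK4 integration; gradients flow through the unrolled loop."""
    h = (t1 - t0) / steps
    z = z0
    for _ in range(steps):
        k1 = f(z)
        k2 = f(z + 0.5 * h * k1)
        k3 = f(z + 0.5 * h * k2)
        k4 = f(z + h * k3)
        z = z + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return z

torch.manual_seed(0)
func = ODEFunc()
z0 = torch.randn(4, 2)          # toy batch of initial states
target = torch.zeros(4, 2)      # toy regression target
zT = rk4_solve(func, z0)
loss = ((zT - target) ** 2).mean()

# First-order gradients, keeping the graph so we can differentiate again.
params = list(func.parameters())
grads = torch.autograd.grad(loss, params, create_graph=True)
flat_grad = torch.cat([g.reshape(-1) for g in grads])

# Hessian-vector product H v via double backward: d/d(theta) (grad . v).
v = torch.randn_like(flat_grad)
hvp = torch.autograd.grad(flat_grad @ v, params)
hvp_flat = torch.cat([h.reshape(-1) for h in hvp])
```

Repeated Hessian-vector products of this form are the usual entry point for spectral analysis of the loss Hessian (e.g. power iteration or Lanczos for the top eigenvalues) without ever materialising the full Hessian.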

Requirements:
  • Knowledge of deep neural networks and a basic understanding of numerical integration.
  • Proficient in Python and familiar with PyTorch or JAX.
  • Current Stellenbosch University student.

Expectations:
  • Regular stand-up meetings to discuss ideas and progress.
  • Implementation of baselines and Hessian-based analysis, with appropriate documentation.
  • Applicant must have their own computer and a stable internet connection. Access to remote computational resources will be provided.

Remuneration: R20 000 for 6 weeks over the Nov 2022 to Feb 2023 period, with some flexibility on the dates. You will be added as an author if your contributions lead to a publication.

Application procedure: Please send a short CV and your academic transcript to Shane Josias (josias@sun.ac.za) by 31 October 2022. Shortlisted candidates will be invited to a short discussion by 04 November 2022.