I can be reached at:
Building BSB, room 2/001
The icons below should cover everything, but in case you prefer text: a current academic curriculum vitæ is available. My ORCID is 0000-0003-4335-0302. I also have a Google Scholar profile, a ResearchGate profile, a Publons profile, and a GitHub account. Sometimes, I tweet things.
- 2019-11-01: Our work Topological Autoencoders has been accepted for an oral presentation at the Swiss Machine Learning Day 2019 (SMLD).
- 2019-09-27: I am truly honoured to serve as a mentor for the New in ML 2019 Workshop among so many of my role models.
- 2019-09-16: The slides for my keynote talk at the Applications of Topological Data Analysis Workshop are now available.
- 2019-09-04: I am delighted to receive an outstanding reviewer award for the ECMLPKDD 2019 Journal Track.
- 2019-09-03: Our work Wasserstein Weisfeiler–Lehman Graph Kernels has been accepted at NeurIPS 2019 with a spotlight presentation.
- 2019-09-03: I am honoured to receive a free NeurIPS 2019 registration for being an outstanding reviewer.
- 2019-08-09: Our work A Wasserstein Subsequence Kernel for Time Series has been accepted at the IEEE International Conference on Data Mining (ICDM).
- 2019-08-01: Four new preprints have been added. They will become chapters of the upcoming book Topological Methods in Data Analysis and Visualization V.
- 2019-07-06: Our work Early Recognition of Sepsis with Gaussian Process Temporal Convolutional Networks and Dynamic Time Warping has been accepted for a poster presentation and a spotlight talk at the Machine Learning for Healthcare conference (MLHC).
- 2019-05-20: I am honoured to have been invited to give a keynote talk at the Applications of Topological Data Analysis Workshop, co-located with ECMLPKDD 2019. Stay tuned for more information.
- 2019-04-21: Our work A Persistent Weisfeiler–Lehman Procedure for Graph Classification has been accepted for a poster presentation and a short talk at the International Conference on Machine Learning (ICML).
- 2018-12-18: Our work Neural Persistence: A Complexity Measure for Deep Neural Networks Using Algebraic Topology has been accepted for a poster presentation at the International Conference on Learning Representations (ICLR).