Alumni
2014 - 2016 | Post doc. at the Section for Cognitive Systems at the Technical University of Denmark. |
2012 - 2014 | Post doc. at the Perceiving Systems department at the Max Planck Institute for Intelligent Systems. |
2011 - 2012 | Post doc. at the Department of Computer Science at the University of Copenhagen. |
2008 - 2011 | PhD in Computer Science from the University of Copenhagen. |
2010 | Visiting Scholar at CITRIS, UC Berkeley. |
2008 | Consultant in software and statistics at Dralle A/S. |
2007 - 2008 | Research assistant at the Department of Computer Science at the University of Copenhagen. |
2000 - 2007 | Bachelor and Master's degree in Computer Science from the University of Copenhagen. |
1994 - 2007 | Various employments (dishwasher, secretary, toy salesman, teaching assistant, etc.) |
2013 | The Danish Council for Independent Research has kindly given me a Sapere Aude: DFF-Research Talent award.
2013 | The Danish Council for Independent Research (Natural Sciences) will fund my post doc position at the Technical University of Denmark. |
2013 | Amazon.com has kindly donated access to their elastic cloud compute servers through an AWS in Education Machine Learning Research Grant award. |
2011 | The Villum Foundation fully funds my post doc at the Max Planck Institute for Intelligent Systems. I was also offered a stipend from MPI for this position. |
2010 | The Danish Agency for Science, Technology and Innovation covered travel expenses and tuition for my visiting scholarship at Ruzena Bajcsy's group at CITRIS, UC Berkeley.
2008 | My PhD was fully funded by a scholarship from the Faculty of Science, University of Copenhagen. |
2007 | My Master's thesis was supported by a grant from the Body and Mind research priority at the University of Copenhagen. |
June 25, 2011 | Parts of my PhD work were covered in the engineering news magazine Ingeniøren.
May 25, 2011 | Parts of my PhD work were presented on Danish national radio in Harddisken.
Programmer | As an extension of my PhD work, I handled the daily management of a research programmer employed to create demonstration software for physical rehabilitation.
Student Supervision |
I am currently co-supervising Michael Schober with Philipp Hennig. During my PhD studies, I supervised 7 student projects, including 2 Master's theses, which resulted in two publications.
Open Source | As part of my contributions to Free and Open Source software development, I led the Octave-Forge project from 2006 to 2011. The project consists of 70+ independent developers.
Organization | From 2012 to 2014 I organized the Intelligent Systems Colloquium talk series in Tübingen.
In general, I try to make software as available as is practical (sometimes the burden of maintaining software in public outweighs the benefits). Below is some of the software that is currently available. If you know I have software which could be useful to you, but it is not available here, just send me an e-mail and I'll do my best to help you out.
Most of the software used in our papers on articulated tracking / pose estimation / whatever-you-call-it is available at http://humim.org.
The software developed for the NIPS 2012 paper on regression and dimensionality reduction with multiple metrics can be found at the project website.
Cholesky-like decomposition of positive semidefinite matrices: when sampling from a Gaussian with covariance matrix S, the best solution is generally to perform a Cholesky decomposition S = RᵀR and multiply isotropic Gaussian samples with R. Sadly, this fails if S is only positive semidefinite, which is very often the case. The standard solution is to perform an eigenvalue decomposition of S, but that can be computationally quite demanding in high dimensions. I put together a simple function that solves this problem reasonably well by first trying a Cholesky decomposition and, if that fails, then an LDL decomposition.
[Matlab code]
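Below is a minimal sketch of the same trick in Python/NumPy rather than Matlab; the function name psd_factor, the SciPy LDL fallback, and the example covariance are my own illustration here, not the released Matlab code.

import numpy as np
from scipy.linalg import ldl

def psd_factor(S):
    """Return R such that R @ R.T reconstructs the positive semidefinite matrix S."""
    try:
        # Fast path: plain Cholesky succeeds when S is strictly positive definite.
        # (Lower-triangular convention, unlike Matlab's chol.)
        return np.linalg.cholesky(S)
    except np.linalg.LinAlgError:
        # Fallback: an LDL^T decomposition also handles the semidefinite case.
        L, D, _ = ldl(S, lower=True)        # S == L @ D @ L.T
        d = np.clip(np.diag(D), 0.0, None)  # for PSD input D is diagonal; clip tiny negatives
        return L * np.sqrt(d)               # same as L @ diag(sqrt(d))

# Draw samples from N(0, S) even though S is rank deficient (rank 2 here).
rng = np.random.default_rng(0)
S = np.array([[2.0, 1.0, 1.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])
R = psd_factor(S)
samples = rng.standard_normal((1000, 3)) @ R.T  # sample covariance is approximately S

Note that a nearly singular covariance can make the Cholesky path fail numerically even when S is positive definite in exact arithmetic; the LDL fallback then takes over.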
I am no longer a post doc at the Max Planck Institute, so this web page is no longer updated. My current employment is at DTU Compute: new web site.
Model what you can; learn the rest. A simple statement that summarizes my approach to computer vision, machine learning and science in general. I believe that we should always try to incorporate as much known information as possible into our models before learning unknown parameters -- even if that means we have to derive new models from scratch! In practice, my work is mostly concerned with (but not limited to) the following topics:
Most naturally occurring phenomena are complex enough to necessitate machine learning as an integral part of building models of said phenomena. However, with machine learning comes a need for large amounts of data, which may be hard to acquire, e.g. due to price and time constraints or simply because the phenomenon is rare in nature. In such cases we rely on expert knowledge to guide the learning scheme. Sadly, our tools for incorporating such knowledge are not very strong: we often resort to ad hoc regularization techniques or seek indirect sources of information such as data labels.
I believe we can do better!
Fundamentally: the more we know, the less we have to learn from data. Often experts can provide more direct pieces of information about the phenomenon, e.g. that the solution has to satisfy a certain set of constraints or that a specific distance measure is to be preferred over the one naturally implied by the vector representation of the data (assuming such a representation even exists). Sadly, most machine learning techniques are incapable of incorporating these clues in a principled way. This often forces practitioners to ignore the expert knowledge, which increases the need for data, as the machine learning technique now also has to learn what the expert already knew.
My research revolves around the idea that we should incorporate as much expert knowledge as possible and only attempt to learn what we do not already know. As expert knowledge is most often not linear, we are forced away from Euclidean models. This removes one of the most fundamental assumptions behind modern statistical tools and we need to create new ones.
My current work is centred around Riemannian geometry as I find that to be a natural and practical way of incorporating further expert knowledge. Still, there are many problems which cannot be described in this setting...
Hauberg, S., Feragen, A., Enficiaud, R., Black, M.
Scalable Robust Principal Component Analysis using Grassmann Averages
IEEE Trans. Pattern Analysis and Machine Intelligence (PAMI), December 2015 (article)
Freifeld, O., Hauberg, S., Black, M. J.
Model Transport: Towards Scalable Transfer Learning on Manifolds
In Proceedings IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), pages: 1378-1385, Columbus, Ohio, USA, IEEE International Conference on Computer Vision and Pattern Recognition, June 2014 (inproceedings)
Freifeld, O., Hauberg, S., Black, M. J.
Model transport: towards scalable transfer learning on manifolds - supplemental material
(9), April 2014 (techreport)
Hennig, P., Hauberg, S.
Probabilistic Solutions to Differential Equations and their Application to Riemannian Statistics
In Proceedings of the 17th International Conference on Artificial Intelligence and Statistics, 33, pages: 347-355, JMLR: Workshop and Conference Proceedings, (Editors: S Kaski and J Corander), Microtome Publishing, Brookline, MA, AISTATS, April 2014 (inproceedings)
Schober, M., Kasenburg, N., Feragen, A., Hennig, P., Hauberg, S.
Probabilistic Shortest Path Tractography in DTI Using Gaussian Process ODE Solvers
In Medical Image Computing and Computer-Assisted Intervention – MICCAI 2014, Lecture Notes in Computer Science Vol. 8675, pages: 265-272, (Editors: P. Golland, N. Hata, C. Barillot, J. Hornegger and R. Howe), Springer, Heidelberg, MICCAI, 2014 (inproceedings)