# Publications

## 2023

B. Boll, **A. Zeilmann**, S. Petra, C. Schnörr: Self-Certifying Classification by Linearized Deep Assignment. PAMM, May 2023

DOI: 10.1002/pamm.202200169

We propose a novel class of deep stochastic predictors for classifying metric data on graphs within the PAC-Bayes risk certification paradigm. Classifiers are realized as linearly parametrized deep assignment flows with random initial conditions. Building on the recent PAC-Bayes literature and data-dependent priors, this approach enables (i) to use risk bounds as training objectives for learning posterior distributions on the hypothesis space and (ii) to compute tight out-of-sample risk certificates of randomized classifiers more efficiently than related work. Comparison with empirical test set errors illustrates the performance and practicality of this self-certifying classification method.
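Risk certificates of this kind are typically obtained by inverting a PAC-Bayes-kl bound. The sketch below uses the generic textbook form of that bound with made-up numbers (`n`, `delta`, `kl_div`, `emp_risk` are all illustrative), not the paper's exact construction:

```python
import math

# Hedged sketch of a standard PAC-Bayes-kl certificate (generic textbook form,
# illustrative numbers; not the paper's exact bound): with probability 1 - delta,
#   kl(emp_risk || true_risk) <= (KL(Q || P) + ln(2*sqrt(n)/delta)) / n,
# and the certificate is the largest true risk consistent with this inequality.
def binary_kl(q, p):
    eps = 1e-12
    q = min(max(q, eps), 1 - eps)
    p = min(max(p, eps), 1 - eps)
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def kl_inverse(emp_risk, bound):
    # Bisection for the upper inverse of p -> kl(emp_risk || p),
    # which is monotone increasing for p >= emp_risk.
    lo, hi = emp_risk, 1.0 - 1e-12
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if binary_kl(emp_risk, mid) > bound:
            hi = mid
        else:
            lo = mid
    return hi

n, delta, kl_div, emp_risk = 10000, 0.05, 20.0, 0.08  # made-up numbers
rhs = (kl_div + math.log(2 * math.sqrt(n) / delta)) / n
certificate = kl_inverse(emp_risk, rhs)
```

Minimizing such a bound during training and evaluating the inverted bound afterwards is the general pattern behind using risk bounds as training objectives and reporting out-of-sample certificates.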

@article{Boll2023Self, abstract = {We propose a novel class of deep stochastic predictors for classifying metric data on graphs within the PAC-Bayes risk certification paradigm. Classifiers are realized as linearly parametrized deep assignment flows with random initial conditions. Building on the recent PAC-Bayes literature and data-dependent priors, this approach enables (i) to use risk bounds as training objectives for learning posterior distributions on the hypothesis space and (ii) to compute tight out-of-sample risk certificates of randomized classifiers more efficiently than related work. Comparison with empirical test set errors illustrates the performance and practicality of this self-certifying classification method.}, author = {Boll, Bastian and Zeilmann, Alexander and Petra, Stefania and Schn{\"o}rr, Christoph}, journaltitle = {PAMM}, shortjournal = {Proc Appl Math \& Mech}, doi = {10.1002/pamm.202200169}, issn = {1617-7061}, number = {1}, date = {2023-05-31}, language = {en}, publisher = {Wiley}, title = {Self-{Certifying} {Classification} by {Linearized} {Deep} {Assignment}}, url = {http://dx.doi.org/10.1002/pamm.202200169}, volume = {23}, }

A. Ahadova, A. Stenzinger, T. Seppälä, R. Hüneburg, M. Kloor, H. Bläker, J. Wittemann, V. Endris, L. Gerling, V. Bertram, M. Neumuth, J. Witt, S. Graf, G. Kristiansen, O. Hommerding, S. Haupt, **A. Zeilmann**, V. Heuveline, D. Kazdal, J. Gebert, M. von Knebel Doeberitz, J. Mecklin, J. Nattermann: A “two-in-one hit” model of shortcut carcinogenesis in MLH1 Lynch syndrome carriers. Gastroenterology, March 2023

DOI: 10.1053/j.gastro.2023.03.007

The clinical presentation of Lynch syndrome (LS), the most common inherited cancer syndrome, varies depending on the affected mismatch repair (MMR) gene. Although both MLH1 and MSH2 carriers have a high risk of colorectal cancer (CRC) under regular surveillance, MSH2 carriers have a significantly higher risk of developing adenomas compared with MLH1 carriers. A CRC pathway presumably skipping the adenoma phase has been previously associated with somatic CTNNB1 mutations. About 50% of MLH1-associated CRCs display CTNNB1 mutations, whereas MSH2-associated CRCs are rarely CTNNB1 mutant and more commonly show somatic APC mutations.

@article{Ahadova2023two, abstract = {The clinical presentation of Lynch syndrome (LS), the most common inherited cancer syndrome, varies depending on the affected mismatch repair (MMR) gene. Although both MLH1 and MSH2 carriers have a high risk of colorectal cancer (CRC) under regular surveillance, MSH2 carriers have a significantly higher risk of developing adenomas compared with MLH1 carriers. A CRC pathway presumably skipping the adenoma phase has been previously associated with somatic CTNNB1 mutations. About 50% of MLH1-associated CRCs display CTNNB1 mutations, whereas MSH2-associated CRCs are rarely CTNNB1 mutant and more commonly show somatic APC mutations.}, author = {Ahadova, Aysel and Stenzinger, Albrecht and Sepp{\" a}l{\" a}, Toni and H{\" u}neburg, Robert and Kloor, Matthias and Bl{\" a}ker, Hendrik and Wittemann, Jan-Niklas and Endris, Volker and Gerling, Leonie and Bertram, Veit and Neumuth, Marie Theres and Witt, Johannes and Graf, Sebastian and Kristiansen, Glen and Hommerding, Oliver and Haupt, Saskia and Zeilmann, Alexander and Heuveline, Vincent and Kazdal, Daniel and Gebert, Johannes and von Knebel Doeberitz, Magnus and Mecklin, Jukka-Pekka and Nattermann, Jacob}, journaltitle = {Gastroenterology}, shortjournal = {Gastroenterology}, doi = {10.1053/j.gastro.2023.03.007}, issn = {0016-5085}, date = {2023-03-10}, language = {en}, publisher = {Elsevier BV}, title = {A ``two-in-one hit'' model of shortcut carcinogenesis in {MLH1} {Lynch} syndrome carriers}, url = {http://dx.doi.org/10.1053/j.gastro.2023.03.007}, }

**A. Zeilmann**, S. Petra, C. Schnörr: Learning Linearized Assignment Flows for Image Labeling. Journal of Mathematical Imaging and Vision, January 2023

DOI: 10.1007/s10851-022-01132-9

We introduce a novel algorithm for estimating optimal parameters of linearized assignment flows for image labeling. An exact formula is derived for the parameter gradient of any loss function that is constrained by the linear system of ODEs determining the linearized assignment flow. We show how to efficiently evaluate this formula using a Krylov subspace and a low-rank approximation. This enables us to perform parameter learning by Riemannian gradient descent in the parameter space, without the need to backpropagate errors or to solve an adjoint equation. Experiments demonstrate that our method performs as good as highly-tuned machine learning software using automatic differentiation. Unlike methods employing automatic differentiation, our approach yields a low-dimensional representation of internal parameters and their dynamics which helps to understand how assignment flows and more generally neural networks work and perform.
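The linear ODE constraint mentioned above admits a matrix-exponential solution, and the Krylov-subspace idea is to evaluate the exponential's action on a vector without ever forming the matrix exponential itself. A minimal illustration with a random stand-in matrix (not the paper's assignment-flow parametrization):

```python
import numpy as np
from scipy.linalg import expm
from scipy.sparse.linalg import expm_multiply

# Toy stand-in: the linearized assignment flow is governed by a linear ODE
#   v'(t) = A v(t),  v(0) = v0,  with closed-form solution v(t) = expm(t A) v0.
# Krylov-type routines evaluate the action expm(t A) v0 directly.
rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n)) / np.sqrt(n)  # hypothetical parameter matrix
v0 = rng.standard_normal(n)
t = 1.0

v_dense = expm(t * A) @ v0           # dense reference, O(n^3) in general
v_action = expm_multiply(t * A, v0)  # action-based evaluation

err = np.linalg.norm(v_dense - v_action) / np.linalg.norm(v_dense)
```

For image-sized problems the dense reference is infeasible, which is why the action-based evaluation (combined with low-rank structure in the paper) matters.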

@article{Zeilmann2023Learning, abstract = {We introduce a novel algorithm for estimating optimal parameters of linearized assignment flows for image labeling. An exact formula is derived for the parameter gradient of any loss function that is constrained by the linear system of ODEs determining the linearized assignment flow. We show how to efficiently evaluate this formula using a Krylov subspace and a low-rank approximation. This enables us to perform parameter learning by Riemannian gradient descent in the parameter space, without the need to backpropagate errors or to solve an adjoint equation. Experiments demonstrate that our method performs as good as highly-tuned machine learning software using automatic differentiation. Unlike methods employing automatic differentiation, our approach yields a low-dimensional representation of internal parameters and their dynamics which helps to understand how assignment flows and more generally neural networks work and perform.}, author = {Zeilmann, Alexander and Petra, Stefania and Schn{\" o}rr, Christoph}, journaltitle = {Journal of Mathematical Imaging and Vision}, shortjournal = {J Math Imaging Vis}, doi = {10.1007/s10851-022-01132-9}, issn = {0924-9907}, number = {1}, date = {2023-01-17}, language = {en}, pages = {164--184}, publisher = {{Springer Science and Business Media LLC}}, title = {Learning {Linearized} {Assignment} {Flows} for {Image} {Labeling}}, url = {http://dx.doi.org/10.1007/s10851-022-01132-9}, volume = {65}, }

## 2022

B. Boll, **A. Zeilmann**, S. Petra, C. Schnörr: Self-Certifying Classification by Linearized Deep Assignment. arXiv, January 2022. Preprint

We propose a novel class of deep stochastic predictors for classifying metric data on graphs within the PAC-Bayes risk certification paradigm. Classifiers are realized as linearly parametrized deep assignment flows with random initial conditions. Building on the recent PAC-Bayes literature and data-dependent priors, this approach enables (i) to use risk bounds as training objectives for learning posterior distributions on the hypothesis space and (ii) to compute tight out-of-sample risk certificates of randomized classifiers more efficiently than related work. Comparison with empirical test set errors illustrates the performance and practicality of this self-certifying classification method.

@article{Boll2022Self, abstract = {We propose a novel class of deep stochastic predictors for classifying metric data on graphs within the PAC-Bayes risk certification paradigm. Classifiers are realized as linearly parametrized deep assignment flows with random initial conditions. Building on the recent PAC-Bayes literature and data-dependent priors, this approach enables (i) to use risk bounds as training objectives for learning posterior distributions on the hypothesis space and (ii) to compute tight out-of-sample risk certificates of randomized classifiers more efficiently than related work. Comparison with empirical test set errors illustrates the performance and practicality of this self-certifying classification method.}, author = {Boll, Bastian and Zeilmann, Alexander and Petra, Stefania and Schn{\" o}rr, Christoph}, journaltitle = {arXiv}, date = {2022-01-28}, title = {Self-{Certifying} {Classification} by {Linearized} {Deep} {Assignment}}, url = {https://arxiv.org/abs/2201.11162}, }

## 2021

A. Zern, **A. Zeilmann**, C. Schnörr: Assignment flows for data labeling on graphs: convergence and stability. Information Geometry, November 2021

DOI: 10.1007/s41884-021-00060-8

The assignment flow recently introduced in the *J. Math. Imaging and Vision* 58/2 (2017) constitutes a high-dimensional dynamical system that evolves on a statistical product manifold and performs contextual labeling (classification) of data given in a metric space. Vertices of an underlying corresponding graph index the data points and define a system of neighborhoods. These neighborhoods together with nonnegative weight parameters define the regularization of the evolution of label assignments to data points, through geometric averaging induced by the affine e-connection of information geometry. From the point of view of evolutionary game dynamics, the assignment flow may be characterized as a large system of replicator equations that are coupled by geometric averaging. This paper establishes conditions on the weight parameters that guarantee convergence of the continuous-time assignment flow to integral assignments (labelings), up to a negligible subset of situations that will not be encountered when working with real data in practice. Furthermore, we classify attractors of the flow and quantify corresponding basins of attraction. This provides convergence guarantees for the assignment flow which are extended to the discrete-time assignment flow that results from applying a Runge–Kutta–Munthe–Kaas scheme for the numerical geometric integration of the assignment flow. Several counter-examples illustrate that violating the conditions may entail unfavorable behavior of the assignment flow regarding contextual data classification.
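For intuition on the replicator characterization, here is a minimal single-vertex sketch (explicit Euler with renormalization and hypothetical per-label scores; the paper's system couples many such equations through geometric averaging):

```python
import numpy as np

# Single replicator equation x' = x * (f - <x, f>): a probability vector
# drifts toward the best-scoring label while staying on the simplex.
def replicator_step(x, f, h=0.01):
    growth = f - np.dot(x, f)   # per-label fitness relative to the mean
    return x + h * x * growth   # explicit Euler step

f = np.array([1.0, 2.0, 5.0])  # hypothetical per-label scores
x = np.full(3, 1.0 / 3.0)      # uniform initial assignment
for _ in range(5000):
    x = replicator_step(x, f)
    x = x / x.sum()            # renormalize against discretization drift
```

The flow converges to the vertex of the simplex with the highest score, i.e. an integral label assignment, which is the kind of limit behavior the convergence conditions in the paper guarantee for the coupled system.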

@article{Zern2021Assignment, abstract = {The assignment flow recently introduced in the \textit{J. Math. Imaging and Vision} 58/2 (2017) constitutes a high-dimensional dynamical system that evolves on a statistical product manifold and performs contextual labeling (classification) of data given in a metric space. Vertices of an underlying corresponding graph index the data points and define a system of neighborhoods. These neighborhoods together with nonnegative weight parameters define the regularization of the evolution of label assignments to data points, through geometric averaging induced by the affine e-connection of information geometry. From the point of view of evolutionary game dynamics, the assignment flow may be characterized as a large system of replicator equations that are coupled by geometric averaging. This paper establishes conditions on the weight parameters that guarantee convergence of the continuous-time assignment flow to integral assignments (labelings), up to a negligible subset of situations that will not be encountered when working with real data in practice. Furthermore, we classify attractors of the flow and quantify corresponding basins of attraction. This provides convergence guarantees for the assignment flow which are extended to the discrete-time assignment flow that results from applying a Runge--Kutta--Munthe--Kaas scheme for the numerical geometric integration of the assignment flow. Several counter-examples illustrate that violating the conditions may entail unfavorable behavior of the assignment flow regarding contextual data classification.}, author = {Zern, Artjom and Zeilmann, Alexander and Schn{\"o}rr, Christoph}, journaltitle = {Information Geometry}, shortjournal = {Info. Geo.}, doi = {10.1007/s41884-021-00060-8}, issn = {2511-2481}, date = {2021-11-18}, language = {en}, pages = {1--50}, publisher = {{Springer Science and Business Media LLC}}, title = {Assignment flows for data labeling on graphs: convergence and stability}, url = {http://dx.doi.org/10.1007/s41884-021-00060-8}, }

D. Gonzalez-Alvarado, **A. Zeilmann**, C. Schnörr: Quantifying Uncertainty of Image Labelings Using Assignment Flows. DAGM German Conference on Pattern Recognition, September 28–October 1, 2021

DOI: 10.1007/978-3-030-92659-5_29

This paper introduces a novel approach to uncertainty quantification of image labelings determined by assignment flows. Local uncertainties caused by ambiguous data and noise are estimated by fitting Dirichlet distributions and pushed forward to the tangent space. The resulting first- and second-order moments are then propagated using a linear ODE parametrization of assignment flows. The corresponding moment evolution equations can be solved in closed form and numerically evaluated using iterative Krylov subspace techniques and low-rank approximation. This results in a faithful representation and quantification of uncertainty in the output space of image labelings, which is important in all applications where confidence in pixelwise decisions matters.
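As a toy illustration of the first step described above, one can fit a Dirichlet distribution to sampled label distributions by moment matching. The data and the simple estimator below are illustrative stand-ins, not the paper's procedure:

```python
import numpy as np

# Moment-matching fit of a Dirichlet to sampled label distributions
# (illustrative data and estimator, not the paper's method).
rng = np.random.default_rng(2)
alpha_true = np.array([4.0, 2.0, 1.0])           # hypothetical ground truth
samples = rng.dirichlet(alpha_true, size=20000)  # rows live on the simplex

mean = samples.mean(axis=0)
var = samples.var(axis=0)
# For Dirichlet(alpha) with alpha0 = sum(alpha):
#   mean_i = alpha_i / alpha0,  var_i = mean_i (1 - mean_i) / (alpha0 + 1),
# so a single coordinate's variance recovers the concentration alpha0.
alpha0 = mean[0] * (1.0 - mean[0]) / var[0] - 1.0
alpha_hat = alpha0 * mean
```

The fitted first- and second-order moments are what gets pushed forward to the tangent space and propagated through the linear ODE in the paper.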

@inproceedings{Gonzalez2021Quantifying, abstract = {This paper introduces a novel approach to uncertainty quantification of image labelings determined by assignment flows. Local uncertainties caused by ambiguous data and noise are estimated by fitting Dirichlet distributions and pushed forward to the tangent space. The resulting first- and second-order moments are then propagated using a linear ODE parametrization of assignment flows. The corresponding moment evolution equations can be solved in closed form and numerically evaluated using iterative Krylov subspace techniques and low-rank approximation. This results in a faithful representation and quantification of uncertainty in the output space of image labelings, which is important in all applications where confidence in pixelwise decisions matters.}, author = {Gonzalez-Alvarado, Daniel and Zeilmann, Alexander and Schn{\"o}rr, Christoph}, booktitle = {DAGM German Conference on Pattern Recognition}, doi = {10.1007/978-3-030-92659-5_29}, venue = {Virtual Event}, date = {2021}, language = {en}, pages = {453--466}, publisher = {Springer International Publishing}, title = {Quantifying {Uncertainty} of {Image} {Labelings} {Using} {Assignment} {Flows}}, url = {https://doi.org/10.1007/978-3-030-92659-5_29}, }

**A. Zeilmann**, S. Petra, C. Schnörr: Learning Linearized Assignment Flows for Image Labeling. arXiv, August 2021. Preprint, accepted in JMIV

We introduce a novel algorithm for estimating optimal parameters of linearized assignment flows for image labeling. An exact formula is derived for the parameter gradient of any loss function that is constrained by the linear system of ODEs determining the linearized assignment flow. We show how to efficiently evaluate this formula using a Krylov subspace and a low-rank approximation. This enables us to perform parameter learning by Riemannian gradient descent in the parameter space, without the need to backpropagate errors or to solve an adjoint equation. Experiments demonstrate that our method performs as good as highly-tuned machine learning software using automatic differentiation. Unlike methods employing automatic differentiation, our approach yields a low-dimensional representation of internal parameters and their dynamics which helps to understand how assignment flows and more generally neural networks work and perform.

@article{Zeilmann2021Learning, abstract = {We introduce a novel algorithm for estimating optimal parameters of linearized assignment flows for image labeling. An exact formula is derived for the parameter gradient of any loss function that is constrained by the linear system of ODEs determining the linearized assignment flow. We show how to efficiently evaluate this formula using a Krylov subspace and a low-rank approximation. This enables us to perform parameter learning by Riemannian gradient descent in the parameter space, without the need to backpropagate errors or to solve an adjoint equation. Experiments demonstrate that our method performs as good as highly-tuned machine learning software using automatic differentiation. Unlike methods employing automatic differentiation, our approach yields a low-dimensional representation of internal parameters and their dynamics which helps to understand how assignment flows and more generally neural networks work and perform.}, author = {Zeilmann, Alexander and Petra, Stefania and Schn{\" o}rr, Christoph}, journaltitle = {arXiv}, date = {2021-08-02}, title = {Learning {Linearized} {Assignment} {Flows} for {Image} {Labeling}}, url = {https://arxiv.org/abs/2108.02571}, }

S. Haupt, **A. Zeilmann**, A. Ahadova, H. Bläker, M. von Knebel Doeberitz, M. Kloor, V. Heuveline: Mathematical modeling of multiple pathways in colorectal carcinogenesis using dynamical systems with Kronecker structure. PLOS Computational Biology, May 2021

DOI: 10.1371/journal.pcbi.1008970

Like many other types of cancer, colorectal cancer (CRC) develops through multiple pathways of carcinogenesis. This is also true for colorectal carcinogenesis in Lynch syndrome (LS), the most common inherited CRC syndrome. However, a comprehensive understanding of the distribution of these pathways of carcinogenesis, which allows for tailored clinical treatment and even prevention, is still lacking. We suggest a linear dynamical system modeling the evolution of different pathways of colorectal carcinogenesis based on the involved driver mutations. The model consists of different components accounting for independent and dependent mutational processes. We define the driver gene mutation graphs and combine them using the Cartesian graph product. This leads to matrix components built by the Kronecker sum and product of the adjacency matrices of the gene mutation graphs enabling a thorough mathematical analysis and medical interpretation. Using the Kronecker structure, we developed a mathematical model which we applied exemplarily to the three pathways of colorectal carcinogenesis in LS. Beside a pathogenic germline variant in one of the DNA mismatch repair (MMR) genes, driver mutations in *APC*, *CTNNB1*, *KRAS* and *TP53* are considered. We exemplarily incorporate mutational dependencies, such as increased point mutation rates after MMR deficiency, and based on recent experimental data, biallelic somatic *CTNNB1* mutations as common drivers of LS-associated CRCs. With the model and parameter choice, we obtained simulation results that are in concordance with clinical observations. These include the evolution of MMR-deficient crypts as early precursors in LS carcinogenesis and the influence of variants in MMR genes thereon. The proportions of MMR-deficient and MMR-proficient *APC*-inactivated crypts as first measure for the distribution among the pathways in LS-associated colorectal carcinogenesis are compatible with clinical observations. The approach provides a modular framework for modeling multiple pathways of carcinogenesis yielding promising results in concordance with clinical observations in LS CRCs.
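The Kronecker-sum construction can be sketched in a few lines: combining two per-gene mutation graphs via the Cartesian graph product yields the generator A ⊗ I + I ⊗ B on the joint state space. Sizes and rates below are made up for illustration:

```python
import numpy as np

# Kronecker sum of two generators: A ⊕ B = A ⊗ I + I ⊗ B.
def kron_sum(A, B):
    return np.kron(A, np.eye(B.shape[0])) + np.kron(np.eye(A.shape[0]), B)

# Two hypothetical 2-state gene mutation graphs (wild-type -> mutated),
# written as rate matrices whose columns sum to zero.
rate_a, rate_b = 0.1, 0.2
A = np.array([[-rate_a, 0.0], [rate_a, 0.0]])
B = np.array([[-rate_b, 0.0], [rate_b, 0.0]])

G = kron_sum(A, B)  # 4x4 generator on the joint mutation states
```

The Kronecker sum of two rate matrices is again a rate matrix, so the joint dynamics stay a valid linear dynamical system however many gene graphs are combined, which is the scalability property the abstract emphasizes.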

@article{Haupt2021Mathematical, abstract = {Like many other types of cancer, colorectal cancer (CRC) develops through multiple pathways of carcinogenesis. This is also true for colorectal carcinogenesis in Lynch syndrome (LS), the most common inherited CRC syndrome. However, a comprehensive understanding of the distribution of these pathways of carcinogenesis, which allows for tailored clinical treatment and even prevention, is still lacking. We suggest a linear dynamical system modeling the evolution of different pathways of colorectal carcinogenesis based on the involved driver mutations. The model consists of different components accounting for independent and dependent mutational processes. We define the driver gene mutation graphs and combine them using the Cartesian graph product. This leads to matrix components built by the Kronecker sum and product of the adjacency matrices of the gene mutation graphs enabling a thorough mathematical analysis and medical interpretation. Using the Kronecker structure, we developed a mathematical model which we applied exemplarily to the three pathways of colorectal carcinogenesis in LS. Beside a pathogenic germline variant in one of the DNA mismatch repair (MMR) genes, driver mutations in \textit{APC}, \textit{CTNNB1}, \textit{KRAS} and \textit{TP53} are considered. We exemplarily incorporate mutational dependencies, such as increased point mutation rates after MMR deficiency, and based on recent experimental data, biallelic somatic \textit{CTNNB1} mutations as common drivers of LS-associated CRCs. With the model and parameter choice, we obtained simulation results that are in concordance with clinical observations. These include the evolution of MMR-deficient crypts as early precursors in LS carcinogenesis and the influence of variants in MMR genes thereon. 
The proportions of MMR-deficient and MMR-proficient \textit{APC}-inactivated crypts as first measure for the distribution among the pathways in LS-associated colorectal carcinogenesis are compatible with clinical observations. The approach provides a modular framework for modeling multiple pathways of carcinogenesis yielding promising results in concordance with clinical observations in LS CRCs.}, author = {Haupt, Saskia and Zeilmann, Alexander and Ahadova, Aysel and Bl{\" a}ker, Hendrik and von Knebel Doeberitz, Magnus and Kloor, Matthias and Heuveline, Vincent}, journaltitle = {PLOS Computational Biology}, shortjournal = {PLoS Comput Biol}, doi = {10.1371/journal.pcbi.1008970}, editor = {Chen, Jing}, issn = {1553-7358}, number = {5}, date = {2021-05-18}, language = {en}, pages = {e1008970}, eid = {e1008970}, eprint = {34003820}, eprinttype = {pubmed}, publisher = {Public Library of Science (PLoS)}, title = {Mathematical modeling of multiple pathways in colorectal carcinogenesis using dynamical systems with {Kronecker} structure}, url = {http://dx.doi.org/10.1371/journal.pcbi.1008970}, volume = {17}, }

**A. Zeilmann**, S. Petra, C. Schnörr: Learning Linear Assignment Flows for Image Labeling via Exponential Integration. Scale Space and Variational Methods in Computer Vision, May 16–20, 2021

DOI: 10.1007/978-3-030-75549-2_31

We introduce a novel algorithm for estimating optimal parameters of linear assignment flows for image labeling. This flow is determined by the solution of a linear ODE in terms of a high-dimensional integral. A formula of the gradient of the solution with respect to the flow parameters is derived and approximated using Krylov subspace techniques. Riemannian descent in the parameter space enables to determine optimal parameters for a 512×512 image in less than 10s, without the need to backpropagate errors or to solve an adjoint equation. Numerical experiments demonstrate a high generative model expressivity despite the linearity of the assignment flow parametrization.

@inproceedings{Zeilmann2021Learning, abstract = {We introduce a novel algorithm for estimating optimal parameters of linear assignment flows for image labeling. This flow is determined by the solution of a linear ODE in terms of a high-dimensional integral. A formula of the gradient of the solution with respect to the flow parameters is derived and approximated using Krylov subspace techniques. Riemannian descent in the parameter space enables to determine optimal parameters for a 512\texttimes{}512 image in less than 10s, without the need to backpropagate errors or to solve an adjoint equation. Numerical experiments demonstrate a high generative model expressivity despite the linearity of the assignment flow parametrization.}, author = {Zeilmann, Alexander and Petra, Stefania and Schn{\" o}rr, Christoph}, booktitle = {Lecture {Notes} in {Computer} {Science}}, doi = {10.1007/978-3-030-75549-2_31}, venue = {Virtual Event}, issn = {0302-9743}, date = {2021-04-30}, pages = {385--397}, publisher = {Springer International Publishing}, title = {Learning {Linear} {Assignment} {Flows} for {Image} {Labeling} via {Exponential} {Integration}}, url = {http://dx.doi.org/10.1007/978-3-030-75549-2_31}, }

## 2020

S. Haupt, **A. Zeilmann**, A. Ahadova, M. von Knebel Doeberitz, M. Kloor, V. Heuveline: Mathematical Modeling of Multiple Pathways in Colorectal Carcinogenesis using Dynamical Systems with Kronecker Structure. bioRxiv, August 2020. Preprint

DOI: 10.1101/2020.08.14.250175

Like many other tumors, colorectal cancers develop through multiple pathways containing different driver mutations. This is also true for colorectal carcinogenesis in Lynch syndrome, the most common inherited colorectal cancer syndrome. However, a comprehensive understanding of Lynch syndrome tumor evolution which allows for tailored clinical treatment and even prevention is still lacking. We suggest a linear autonomous dynamical system modeling the evolution of the different pathways. Starting with the gene mutation graphs of the driver genes, we formulate three key assumptions about how these different mutations might be combined. This approach leads to a dynamical system that is built by the Kronecker sum of the adjacency matrices of the gene mutation graphs. This Kronecker structure makes the dynamical system amenable to a thorough mathematical analysis and medical interpretation, even if the number of incorporated genes or possible mutation states is increased. For the case that some of the mathematical key assumptions are not satisfied, we explain possible extensions to our model. Additionally, improved bio-medical measurements or novel medical insights can be integrated into the model in a straightforward manner, as all parameters in the model have a biological interpretation. Modifications of the model are able to account for other forms of colorectal carcinogenesis, such as Lynch-like and familial adenomatous polyposis cases.

@article{Haupt2020Mathematical, abstract = {Like many other tumors, colorectal cancers develop through multiple pathways containing different driver mutations. This is also true for colorectal carcinogenesis in Lynch syndrome, the most common inherited colorectal cancer syndrome. However, a comprehensive understanding of Lynch syndrome tumor evolution which allows for tailored clinical treatment and even prevention is still lacking. We suggest a linear autonomous dynamical system modeling the evolution of the different pathways. Starting with the gene mutation graphs of the driver genes, we formulate three key assumptions about how these different mutations might be combined. This approach leads to a dynamical system that is built by the Kronecker sum of the adjacency matrices of the gene mutation graphs. This Kronecker structure makes the dynamical system amenable to a thorough mathematical analysis and medical interpretation, even if the number of incorporated genes or possible mutation states is increased. For the case that some of the mathematical key assumptions are not satisfied, we explain possible extensions to our model. Additionally, improved bio-medical measurements or novel medical insights can be integrated into the model in a straightforward manner, as all parameters in the model have a biological interpretation. Modifications of the model are able to account for other forms of colorectal carcinogenesis, such as Lynch-like and familial adenomatous polyposis cases.}, author = {Haupt, Saskia and Zeilmann, Alexander and Ahadova, Aysel and von Knebel Doeberitz, Magnus and Kloor, Matthias and Heuveline, Vincent}, journaltitle = {bioRxiv}, doi = {10.1101/2020.08.14.250175}, date = {2020-08-14}, publisher = {Cold Spring Harbor Laboratory}, title = {Mathematical {Modeling} of {Multiple} {Pathways} in {Colorectal} {Carcinogenesis} using {Dynamical} {Systems} with {Kronecker} {Structure}}, url = {http://dx.doi.org/10.1101/2020.08.14.250175}, }

**A. Zeilmann**, F. Savarino, S. Petra, C. Schnörr: Geometric numerical integration of the assignment flow. Inverse Problems, February 2020

DOI: 10.1088/1361-6420/ab2772

The assignment flow is a smooth dynamical system that evolves on an elementary statistical manifold and performs contextual data labeling on a graph. We derive and introduce the linear assignment flow that evolves nonlinearly on the manifold, but is governed by a linear ODE on the tangent space. Various numerical schemes adapted to the mathematical structure of these two models are designed and studied, for the geometric numerical integration of both flows: embedded Runge–Kutta–Munthe-Kaas schemes for the nonlinear flow, adaptive Runge–Kutta schemes and exponential integrators for the linear flow. All algorithms are parameter free, except for setting a tolerance value that specifies adaptive step size selection by monitoring the local integration error, or fixing the dimension of the Krylov subspace approximation. These algorithms provide a basis for applying the assignment flow to machine learning scenarios beyond supervised labeling, including unsupervised labeling and learning from controlled assignment flows.
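As a rough illustration of the tangent-space viewpoint, the sketch below uses plain geometric Euler steps with a lifting map and trivial neighborhoods (all data made up); the paper's embedded Runge–Kutta–Munthe-Kaas schemes, step size adaptivity, and exponential integrators are not reproduced here:

```python
import numpy as np

def lift(W, V):
    """Move the row-stochastic matrix W along tangent directions V without
    leaving the simplex: W <- W * exp(V), renormalized row-wise."""
    U = W * np.exp(V - V.max(axis=1, keepdims=True))   # numerically stabilized
    return U / U.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
n, c = 6, 3                          # 6 graph nodes, 3 labels
S = rng.random((n, c)) + 0.1         # made-up data term (label similarities)
W = np.full((n, c), 1.0 / c)         # barycenter initialization
h = 0.5                              # step size
for _ in range(100):
    W = lift(W, h * np.log(S))       # geometric Euler step, trivial neighborhoods
labels = W.argmax(axis=1)            # integral assignment per node
```

Every iterate stays exactly on the product of simplices; with this trivial coupling the flow simply sharpens each row of W toward its best label.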

@article{Zeilmann2020Geometric,
  abstract     = {The assignment flow is a smooth dynamical system that evolves on an elementary statistical manifold and performs contextual data labeling on a graph. We derive and introduce the linear assignment flow that evolves nonlinearly on the manifold, but is governed by a linear ODE on the tangent space. Various numerical schemes adapted to the mathematical structure of these two models are designed and studied, for the geometric numerical integration of both flows: embedded Runge--Kutta--Munthe-Kaas schemes for the nonlinear flow, adaptive Runge--Kutta schemes and exponential integrators for the linear flow. All algorithms are parameter free, except for setting a tolerance value that specifies adaptive step size selection by monitoring the local integration error, or fixing the dimension of the Krylov subspace approximation. These algorithms provide a basis for applying the assignment flow to machine learning scenarios beyond supervised labeling, including unsupervised labeling and learning from controlled assignment flows.},
  author       = {Zeilmann, Alexander and Savarino, Fabrizio and Petra, Stefania and Schn{\"o}rr, Christoph},
  journaltitle = {Inverse Problems},
  shortjournal = {Inverse Problems},
  doi          = {10.1088/1361-6420/ab2772},
  issn         = {0266-5611},
  number       = {3},
  date         = {2020-02-26},
  pages        = {034003},
  publisher    = {IOP Publishing},
  title        = {Geometric numerical integration of the assignment flow},
  url          = {http://dx.doi.org/10.1088/1361-6420/ab2772},
  volume       = {36},
}

A. Zern, **A. Zeilmann**, C. Schnörr: Assignment Flows for Data Labeling on Graphs: Convergence and Stability. arXiv, February 2020. Preprint

The assignment flow, recently introduced in J. Math. Imaging and Vision 58/2 (2017), constitutes a high-dimensional dynamical system that evolves on an elementary statistical manifold and performs contextual labeling (classification) of data given in any metric space. Vertices of a given graph index the data points and define a system of neighborhoods. These neighborhoods together with nonnegative weight parameters define regularization of the evolution of label assignments to data points, through geometric averaging induced by the affine e-connection of information geometry. Regarding evolutionary game dynamics, the assignment flow may be characterized as a large system of replicator equations that are coupled by geometric averaging. This paper establishes conditions on the weight parameters that guarantee convergence of the continuous-time assignment flow to integral assignments (labelings), up to a negligible subset of situations that will not be encountered when working with real data in practice. Furthermore, we classify attractors of the flow and quantify corresponding basins of attraction. This provides convergence guarantees for the assignment flow which are extended to the discrete-time assignment flow that results from applying a Runge-Kutta-Munthe-Kaas scheme for numerical geometric integration of the assignment flow. Several counter-examples illustrate that violating the conditions may entail unfavorable behavior of the assignment flow regarding contextual data classification.
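The building block of the coupled system, a single replicator equation, can be simulated in a few lines. This hypothetical example (constant fitness vector, explicit Euler steps) shows the convergence from the barycenter to an integral assignment that the paper's conditions guarantee for the full flow:

```python
import numpy as np

def replicator_step(w, f, h):
    """One explicit Euler step of the replicator equation w' = w * (f - <w, f>)."""
    return w + h * w * (f - w @ f)

f = np.array([0.2, 0.5, 1.0])        # made-up fitness values, one per label
w = np.array([1.0, 1.0, 1.0]) / 3.0  # start at the barycenter of the simplex
for _ in range(2000):
    w = replicator_step(w, f, h=0.05)
# w approaches the vertex of the fittest label (an integral assignment)
```

Note that the Euler step preserves the sum of w exactly, since the replicator vector field is tangent to the simplex.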

@article{Zern2020Assignment,
  abstract     = {The assignment flow, recently introduced in J. Math. Imaging and Vision 58/2 (2017), constitutes a high-dimensional dynamical system that evolves on an elementary statistical manifold and performs contextual labeling (classification) of data given in any metric space. Vertices of a given graph index the data points and define a system of neighborhoods. These neighborhoods together with nonnegative weight parameters define regularization of the evolution of label assignments to data points, through geometric averaging induced by the affine e-connection of information geometry. Regarding evolutionary game dynamics, the assignment flow may be characterized as a large system of replicator equations that are coupled by geometric averaging. This paper establishes conditions on the weight parameters that guarantee convergence of the continuous-time assignment flow to integral assignments (labelings), up to a negligible subset of situations that will not be encountered when working with real data in practice. Furthermore, we classify attractors of the flow and quantify corresponding basins of attraction. This provides convergence guarantees for the assignment flow which are extended to the discrete-time assignment flow that results from applying a Runge-Kutta-Munthe-Kaas scheme for numerical geometric integration of the assignment flow. Several counter-examples illustrate that violating the conditions may entail unfavorable behavior of the assignment flow regarding contextual data classification.},
  author       = {Zern, Artjom and Zeilmann, Alexander and Schn{\"o}rr, Christoph},
  journaltitle = {arXiv},
  date         = {2020-02-26},
  title        = {Assignment {Flows} for {Data} {Labeling} on {Graphs}: Convergence and {Stability}},
  url          = {https://arxiv.org/abs/2002.11571},
}

## 2019

**A. Zeilmann**, F. Savarino, S. Petra, C. Schnörr: Exponential Integration of the Linear Assignment Flow. PAMM, September 2019

DOI: 10.1002/pamm.201900434

We introduce the linear assignment flow as an approximation of the full nonlinear assignment flow, which is a method for contextual data labeling on arbitrary graphs. The linear assignment flow is a dynamical system evolving on the tangent space of a statistical manifold. It is numerically determined using exponential integrators and Krylov subspace approximation, for which we provide error estimates. The approximation property of the linear assignment flow is illustrated by a numerical experiment. This work is supplemented by two papers on variational modeling and unsupervised labeling [1].
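The Krylov subspace approximation of the matrix exponential mentioned in the abstract is a standard technique and can be sketched generically (this is not the authors' implementation, and the test matrix is made up):

```python
import numpy as np
from scipy.linalg import expm

def krylov_expm_v(A, v, t=1.0, m=30):
    """Approximate exp(t*A) @ v in an m-dimensional Krylov subspace.

    Arnoldi builds an orthonormal basis V and a small Hessenberg matrix H with
    V.T @ A @ V = H; the large matrix exponential is replaced by expm on H.
    """
    n = v.shape[0]
    beta = np.linalg.norm(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                # happy breakdown: result is exact
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    return beta * (V[:, :m] @ expm(t * H[:m, :m])[:, 0])

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 200)) / np.sqrt(200)   # made-up test matrix
v = rng.standard_normal(200)
approx = krylov_expm_v(A, v, t=1.0, m=30)
exact = expm(1.0 * A) @ v                            # dense reference
```

Only matrix-vector products with A are needed, so the subspace dimension m, not the problem size n, controls the cost of the small exponential.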

@article{Zeilmann2019Exponential,
  abstract     = {We introduce the linear assignment flow as an approximation of the full nonlinear assignment flow, which is a method for contextual data labeling on arbitrary graphs. The linear assignment flow is a dynamical system evolving on the tangent space of a statistical manifold. It is numerically determined using exponential integrators and Krylov subspace approximation, for which we provide error estimates. The approximation property of the linear assignment flow is illustrated by a numerical experiment. This work is supplemented by two papers on variational modeling and unsupervised labeling [1].},
  author       = {Zeilmann, Alexander and Savarino, Fabrizio and Petra, Stefania and Schn{\"o}rr, Christoph},
  journaltitle = {PAMM},
  shortjournal = {Proc. Appl. Math. Mech.},
  doi          = {10.1002/pamm.201900434},
  issn         = {1617-7061},
  number       = {1},
  date         = {2019-09-04},
  language     = {en},
  publisher    = {Wiley},
  title        = {Exponential {Integration} of the {Linear} {Assignment} {Flow}},
  url          = {http://dx.doi.org/10.1002/pamm.201900434},
  volume       = {19},
}

L. Cerrone, **A. Zeilmann**, F. Hamprecht: End-to-End Learned Random Walker for Seeded Image Segmentation. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), June 15–20, 2019

DOI: 10.1109/CVPR.2019.01284

We present an end-to-end learned algorithm for seeded segmentation. Our method is based on the Random Walker algorithm, where we predict the edge weights of the underlying graph using a convolutional neural network. This can be interpreted as learning context-dependent diffusivities for a linear diffusion process. Besides calculating the exact gradient for optimizing these diffusivities, we also propose simplifications that sparsely sample the gradient and still yield competitive results. The proposed method achieves the currently best results on a seeded version of the CREMI neuron segmentation challenge.
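The linear-algebra core of the Random Walker fits in a short sketch. The toy below uses a 1D chain with hand-set edge weights in place of the CNN-predicted ones:

```python
import numpy as np

def random_walker_chain(weights, seeds):
    """Seeded Random Walker on a 1D chain graph.

    weights[i] connects pixels i and i+1 (in the paper such weights are
    predicted by a convolutional neural network). Each unseeded pixel
    receives, per label, the probability that a random walk started there
    first reaches a seed of that label, obtained by solving a linear system
    in the graph Laplacian.
    """
    n = len(weights) + 1
    L = np.zeros((n, n))
    for i, w in enumerate(weights):
        L[i, i] += w
        L[i + 1, i + 1] += w
        L[i, i + 1] -= w
        L[i + 1, i] -= w
    labels = sorted(set(seeds.values()))
    seeded = sorted(seeds)
    free = [i for i in range(n) if i not in seeds]
    M = np.array([[1.0 if seeds[s] == lab else 0.0 for lab in labels]
                  for s in seeded])               # one-hot seed labels
    Lu = L[np.ix_(free, free)]                    # unseeded block
    B = L[np.ix_(free, seeded)]                   # unseeded-to-seed coupling
    probs = np.linalg.solve(Lu, -B @ M)           # discrete Dirichlet problem
    return free, probs

# 8-pixel chain with a weak edge (object boundary) between pixels 3 and 4;
# pixel 0 is seeded with label 0, pixel 7 with label 1.
free, probs = random_walker_chain([1, 1, 1, 0.01, 1, 1, 1], {0: 0, 7: 1})
segmentation = probs.argmax(axis=1)               # splits at the weak edge
```

The segmentation follows the weak edge because the random walk rarely crosses a low-weight (low-diffusivity) link; learning the weights end to end amounts to differentiating through this linear solve.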

@inproceedings{Cerrone2019Walker,
  abstract  = {We present an end-to-end learned algorithm for seeded segmentation. Our method is based on the Random Walker algorithm, where we predict the edge weights of the underlying graph using a convolutional neural network. This can be interpreted as learning context-dependent diffusivities for a linear diffusion process. Besides calculating the exact gradient for optimizing these diffusivities, we also propose simplifications that sparsely sample the gradient and still yield competitive results. The proposed method achieves the currently best results on a seeded version of the CREMI neuron segmentation challenge.},
  author    = {Cerrone, Lorenzo and Zeilmann, Alexander and Hamprecht, Fred A.},
  doi       = {10.1109/CVPR.2019.01284},
  isbn      = {978-1-72813-293-8},
  date      = {2019-06-16/2019-06-20},
  keywords  = {Computer Science - Computer Vision and Pattern Recognition, Computer Science - Machine Learning, ownPublication},
  pages     = {12559--12568},
  location  = {Long Beach, California},
  publisher = {IEEE},
  title     = {End-to-{End} {Learned} {Random} {Walker} for {Seeded} {Image} {Segmentation}},
  url       = {https://ieeexplore.ieee.org/document/8954100},
}

## 2018

**A. Zeilmann**, F. Savarino, S. Petra, C. Schnörr: Geometric Numerical Integration of the Assignment Flow. arXiv, October 2018. Preprint

The assignment flow is a smooth dynamical system that evolves on an elementary statistical manifold and performs contextual data labeling on a graph. We derive and introduce the linear assignment flow that evolves nonlinearly on the manifold, but is governed by a linear ODE on the tangent space. Various numerical schemes adapted to the mathematical structure of these two models are designed and studied, for the geometric numerical integration of both flows: embedded Runge-Kutta-Munthe-Kaas schemes for the nonlinear flow, adaptive Runge-Kutta schemes and exponential integrators for the linear flow. All algorithms are parameter free, except for setting a tolerance value that specifies adaptive step size selection by monitoring the local integration error, or fixing the dimension of the Krylov subspace approximation. These algorithms provide a basis for applying the assignment flow to machine learning scenarios beyond supervised labeling, including unsupervised labeling and learning from controlled assignment flows.

@article{Zeilmann2018Geometric,
  abstract     = {The assignment flow is a smooth dynamical system that evolves on an elementary statistical manifold and performs contextual data labeling on a graph. We derive and introduce the linear assignment flow that evolves nonlinearly on the manifold, but is governed by a linear ODE on the tangent space. Various numerical schemes adapted to the mathematical structure of these two models are designed and studied, for the geometric numerical integration of both flows: embedded Runge-Kutta-Munthe-Kaas schemes for the nonlinear flow, adaptive Runge-Kutta schemes and exponential integrators for the linear flow. All algorithms are parameter free, except for setting a tolerance value that specifies adaptive step size selection by monitoring the local integration error, or fixing the dimension of the Krylov subspace approximation. These algorithms provide a basis for applying the assignment flow to machine learning scenarios beyond supervised labeling, including unsupervised labeling and learning from controlled assignment flows.},
  author       = {Zeilmann, Alexander and Savarino, Fabrizio and Petra, Stefania and Schn{\"o}rr, Christoph},
  journaltitle = {arXiv},
  date         = {2018-10-05},
  title        = {Geometric {Numerical} {Integration} of the {Assignment} {Flow}},
  url          = {https://arxiv.org/abs/1810.06970},
}

Thank you to all my coauthors!