Variational methods, new optimisation techniques and new fast numerical algorithms
Monday 4th September 2017 to Friday 8th September 2017
09:00 to 09:40  Registration  
09:40 to 09:50  Welcome from Christie Marr (INI Deputy Director)  
09:50 to 10:40 
Joachim Weickert Efficient and Stable Schemes for 2D Forward-and-Backward Diffusion
Co-author: Martin Welk (UMIT Hall, Austria). Image enhancement with forward-and-backward (FAB) diffusion is numerically very challenging due to its negative diffusivities. As a remedy, we first extend the explicit nonstandard scheme of Welk et al. (2009) from the 1D scenario to the practically relevant two-dimensional setting. We prove that under a fairly severe time step restriction, this 2D scheme preserves a maximum-minimum principle. Moreover, we find an interesting Lyapunov sequence which guarantees convergence to a flat steady state. Since a global application of the time step size restriction leads to very slow algorithms and is more restrictive than necessary for most pixels, we introduce a much more efficient scheme with locally adapted time step sizes. It applies diffusive interactions of adjacent pixel pairs in a randomized order and adapts the time step size locally. These space-variant time steps are synchronized at sync times which are determined by stability properties of the explicit forward diffusion scheme. Experiments show that our novel two-pixel scheme allows FAB diffusion to be computed with guaranteed stability in the maximum norm at a speed that can be three orders of magnitude higher than that of its explicit counterpart with a global time step size. 
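As a rough illustration of why explicit FAB schemes are delicate, here is a minimal sketch of one explicit 2D step with a single global time step size tau. The diffusivity function, the periodic boundary handling, and all parameters are illustrative assumptions, not the authors' scheme:

```python
import numpy as np

def fab_diffusivity(s2, lam=1.0):
    # Illustrative diffusivity: positive (forward smoothing) for small
    # squared gradient magnitudes s2, negative (backward sharpening) for
    # large ones. A stand-in, not the authors' exact choice.
    return 1.0 - 2.0 * s2 / (s2 + lam ** 2)

def explicit_fab_step(u, tau=0.05, lam=1.0):
    """One explicit step of u_t = div(g(|grad u|^2) grad u) on a
    periodic grid, with a single global time step size tau."""
    ux = 0.5 * (np.roll(u, -1, axis=1) - np.roll(u, 1, axis=1))
    uy = 0.5 * (np.roll(u, -1, axis=0) - np.roll(u, 1, axis=0))
    g = fab_diffusivity(ux ** 2 + uy ** 2, lam)
    # edge diffusivities by arithmetic averaging; fluxes in divergence form
    fx = 0.5 * (g + np.roll(g, -1, axis=1)) * (np.roll(u, -1, axis=1) - u)
    fy = 0.5 * (g + np.roll(g, -1, axis=0)) * (np.roll(u, -1, axis=0) - u)
    return u + tau * (fx - np.roll(fx, 1, axis=1) + fy - np.roll(fy, 1, axis=0))
```

The divergence form conserves the mean grey value, but a stability bound on the global tau must hold for the worst pixel, which is exactly the inefficiency the locally adapted two-pixel scheme of the talk avoids.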
INI 1  
10:40 to 11:10  Morning Coffee  
11:10 to 12:00 
Yuri Boykov Spectral Clustering meets Graphical Models
Co-authors: Dmitri Marin (UWO), Meng Tang (UWO), Ismail Ben Ayed (ETS, Montreal). This talk discusses two seemingly unrelated data analysis methodologies: kernel clustering and graphical models. Clustering is widely used for general data, where kernel methods are particularly popular due to their discriminating power. Graphical models such as Markov Random Fields (MRFs) and related continuous geometric methods represent the state-of-the-art regularization methodology for image segmentation. While both clustering and regularization models are very widely used in machine learning and computer vision, they were not combined before due to significant differences in the corresponding optimization, e.g. spectral relaxation vs. combinatorial methods for submodular optimization and its approximations. This talk reviews the general properties of kernel clustering and graphical models, discusses their limitations (including newly discovered "density biases" in kernel methods), and proposes a general unified framework based on our new bound optimization algorithm. In particular, we show that popular MRF potentials introduce principled geometric and contextual constraints into clustering, while standard kernel methodology allows graphical models to work with arbitrary high-dimensional features.

INI 1  
12:00 to 12:50 
Alfred Bruckstein On Overparametrization in Variational Methods
The talk will survey the idea of using overparametrization in variational methods, in cases when parameterized models for the signals of interest are available. Recently such methods were observed to yield state-of-the-art results in recovering optic flow fields and in some signal segmentation problems.

INI 1  
12:50 to 14:00  Lunch @ Wolfson Court  
14:00 to 14:50 
Martin Burger Nonlinear Spectral Decomposition
In this talk we will discuss nonlinear spectral decompositions in Banach spaces, which shed new light on multiscale methods in imaging and open new possibilities for filtering techniques. We provide a novel geometric interpretation of nonlinear eigenvalue problems in Banach spaces and give conditions under which gradient flows for norms or seminorms yield a spectral decomposition. We will see that under these conditions standard variational schemes are equivalent to the gradient flows for arbitrarily large time steps, recovering previous results, e.g. for the one-dimensional total variation flow, as special cases. The talk is based on joint work with Guy Gilboa, Michael Moeller, Martin Benning, Daniel Cremers, and Lina Eckardt. 
INI 1  
14:50 to 15:40 
Guy Gilboa Nonlinear spectral analysis: beyond the convex case
A brief overview will be given of current results in nonlinear eigenvalue analysis for one-homogeneous functionals. We will then discuss how one can go beyond the convex framework by analyzing decay patterns of iterative filtering, based on sparsity constraints. 
INI 1  
15:40 to 16:10  Afternoon Tea  
16:10 to 17:00 
Jean-François Aujol Video colorization by a variational approach
This work provides a new method to colorize grayscale images. While the reverse operation only requires a standard transformation, the colorization process is an ill-posed problem that requires some priors. Two classes of approach exist in the literature. The first class includes manual methods that need the user to manually add colors on the image to colorize. The second class includes exemplar-based approaches, where a color image with a similar semantic content is provided as input to the method. These two types of priors have their own advantages and drawbacks. In this work, a new variational framework for exemplar-based colorization is proposed. A non-local approach is used to find relevant colors in the source image in order to suggest colors for the grayscale image. The spatial coherency of the result as well as the final color selection is provided by a non-convex variational framework based on a total variation, minimized with an efficient primal-dual algorithm. In this work, we also extend the proposed exemplar-based approach to combine both exemplar-based and manual methods, providing a single framework that unifies the advantages of both approaches. Finally, experiments and comparisons with state-of-the-art methods illustrate the efficiency of our method. This is joint work with Fabien Pierre, Aurélie Bugeau, Nicolas Papadakis, and Vinh Tong Ta. 
INI 1  
17:00 to 17:10  Break / Networking  
17:10 to 18:10  Welcome Wine Reception at INI 
09:00 to 09:50 
Antonin Chambolle Minimization of curvature-dependent functionals
In this joint work with T. Pock (TU Graz, Austria) we present a relaxation of line energies which depend on the curvature, such as the elastica functional, introduced in particular to complete contours in image inpainting problems. Our relaxation is convex and tight on C^2 curves. 
INI 1  
09:50 to 10:40 
Ke Chen Fractional Order Derivatives Regularization: Models, Algorithms and Applications
In variational imaging and other inverse problem modeling, regularization plays a major role. In recent years, high-order regularizers such as the mean curvature, the Gaussian curvature and Euler's elastica have been increasingly studied and applied, and many impressive results over the widely used gradient-based models have been reported. Here we present some results from studying another class of high- and non-integer-order regularizers based on fractional-order derivatives, focusing on two aspects of this class of models: (i) theoretical analysis and advantages; (ii) efficient algorithms. We found that models with regularization by fractional-order derivatives are convex in a suitable space, and that algorithms exploiting structured matrices can be employed to design efficient solvers. Applications to restoration and registration are illustrated. This opens many opportunities to apply these regularizers to a wide class of imaging problems. Ke Chen and J. P. Zhang, EPSRC Liverpool Centre for Mathematics in Healthcare, Centre for Mathematical Imaging Techniques, and Department of Mathematical Sciences, The University of Liverpool, United Kingdom [ http://tinyurl.com/EPSRCLCMH ] 
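To make the "structured matrices" remark concrete, here is a hedged sketch of a fractional-order Tikhonov denoiser solved diagonally in Fourier space. The spectral definition of the fractional operator and all parameters are assumptions for illustration, not the models analyzed in the talk:

```python
import numpy as np

def fractional_laplacian(u, alpha):
    """Spectral realization of (-Delta)^(alpha/2) for a 1D periodic signal,
    one common way to define a fractional-order derivative operator."""
    omega = 2.0 * np.pi * np.fft.fftfreq(u.size)
    return np.fft.ifft(np.abs(omega) ** alpha * np.fft.fft(u)).real

def frac_tikhonov_denoise(f, alpha=1.5, mu=1.0):
    """Closed-form minimizer of
        0.5*||u - f||^2 + 0.5*mu*||(-Delta)^(alpha/4) u||^2,
    i.e. Tikhonov regularization with a fractional-order penalty. The
    regularization matrix is diagonalized by the FFT, so the solve is O(n log n),
    the kind of structured-matrix shortcut the abstract alludes to."""
    omega = 2.0 * np.pi * np.fft.fftfreq(f.size)
    return np.fft.ifft(np.fft.fft(f) / (1.0 + mu * np.abs(omega) ** alpha)).real
```

Varying the non-integer order alpha interpolates between weaker and stronger smoothing without committing to a fixed integer-order derivative.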
INI 1  
10:40 to 11:10  Morning Coffee  
11:10 to 12:00 
Michael Ng Tensor Data Analysis: Models and Algorithms
In this talk, we discuss some models and algorithms for tensor data analysis. Examples in imaging sciences are presented to illustrate the results of the proposed models and algorithms. 
INI 1  
12:00 to 12:50 
Kristian Bredies Preconditioned and accelerated Douglas-Rachford algorithms for the solution of variational imaging problems
Co-author: Hongpeng Sun (Renmin University of China). We present preconditioned and accelerated versions of the Douglas-Rachford (DR) splitting method for the solution of convex-concave saddle-point problems which often arise in variational imaging. The methods make it possible to replace the solution of a linear system in each iteration step of the corresponding DR iteration by approximate solvers, without the need to control the error. These iterations are shown to converge in Hilbert space under minimal assumptions on the preconditioner and for any step size. Moreover, ergodic sequences associated with the iteration admit a convergence rate in terms of restricted primal-dual gaps. Further, strong convexity of one or both of the involved functionals allows for acceleration strategies that yield improved convergence rates. The methods are applied to non-smooth convex variational imaging problems. We discuss denoising and deconvolution with various data discrepancies and total variation (TV) as well as total generalized variation (TGV) penalties. Preconditioners which are specific to these problems are presented, results of numerical experiments are shown, and the benefits of the respective preconditioned iterations are discussed. 
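For orientation, the plain (unpreconditioned) DR iteration that such methods build on can be sketched on a toy problem. The choice of problem and all parameters are illustrative assumptions, not the paper's setting:

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal map of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def dr_lasso_denoise(b, lam, tau=1.0, iters=200):
    """Plain Douglas-Rachford splitting for the toy problem
        min_x 0.5*||x - b||^2 + lam*||x||_1.
    Each DR step evaluates the prox of each term once; preconditioning, as in
    the talk, replaces exact (linear-system) prox solves by approximate ones."""
    z = np.zeros_like(b)
    for _ in range(iters):
        x = (z + tau * b) / (1.0 + tau)            # prox of tau*f, f quadratic
        z = z + soft(2.0 * x - z, tau * lam) - x   # DR update via prox of tau*g
    return (z + tau * b) / (1.0 + tau)
```

For this separable problem the minimizer is known in closed form (soft-thresholding of b), which makes it a convenient correctness check for the iteration.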
INI 1  
12:30 to 18:00 
Computational Challenges in Image Processing  http://www.turinggateway.cam.ac.uk/event/ofbw32 

12:50 to 14:00  Buffet Lunch at INI 
09:00 to 09:50 
Laurent Cohen Geodesic Methods for Interactive Image Segmentation using Finsler metrics
Minimal paths have long been used as an interactive tool to find edges or tubular structures as cost-minimizing curves. The user usually provides start and end points on the image and gets the minimal path as output. These minimal paths correspond to minimal geodesics according to some adapted metric. They are a way to find a (set of) curve(s) globally minimizing the geodesic active contours energy. Finding a geodesic distance can be solved via the Eikonal equation using the fast and efficient Fast Marching method. Different metrics can be adapted to various problems. In the past years we have introduced different extensions of these minimal paths that improve either the interactive aspects or the results. For example, the metric can take into account both scale and orientation of the path. This leads to solving an anisotropic minimal path problem in a 2D or 3D+radius space. We recently introduced the use of Finsler metrics, allowing us to take into account the local curvature in order to smooth the path. The approach can also be adapted to take into account a region term inside the closed curve formed by a set of minimal geodesics. Co-authors: Da Chen and J.-M. Mirebeau 
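As a sketch of the distance-computation step, the following replaces Fast Marching by plain Dijkstra on a 4-connected grid, a simpler but less accurate stand-in that ignores the anisotropic and Finsler refinements discussed in the talk; the cost array and edge-weight averaging are illustrative assumptions:

```python
import heapq

def geodesic_distance(cost, start):
    """Dijkstra on a 4-connected grid as a simple stand-in for a Fast
    Marching solver of the Eikonal equation |grad U| = cost. cost[i][j] > 0
    is the local metric; returns the distance map from `start`."""
    h, w = len(cost), len(cost[0])
    INF = float("inf")
    dist = [[INF] * w for _ in range(h)]
    si, sj = start
    dist[si][sj] = 0.0
    pq = [(0.0, si, sj)]
    while pq:
        d, i, j = heapq.heappop(pq)
        if d > dist[i][j]:
            continue  # stale queue entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w:
                # edge weight: average of the two pixel costs
                nd = d + 0.5 * (cost[i][j] + cost[ni][nj])
                if nd < dist[ni][nj]:
                    dist[ni][nj] = nd
                    heapq.heappush(pq, (nd, ni, nj))
    return dist
```

A minimal path is then extracted by descending the distance map from the user's end point back to the start point.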
INI 1  
09:50 to 10:40 
Tom Goldstein Automating stochastic gradient methods with adaptive batch sizes
This talk will address several issues related to training neural networks using stochastic gradient methods. First, we'll talk about the difficulties of training in a distributed environment, and present a new method called centralVR for boosting the scalability of training methods. Then, we'll talk about the issue of automating stochastic gradient descent, and show that learning rate selection can be simplified using "Big Batch" strategies that adaptively choose mini-batch sizes. 
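The "Big Batch" idea can be caricatured as: grow the mini-batch whenever the gradient noise is no longer small relative to the mean gradient. The following is a hypothetical sketch; the interface, the test rule, and all constants are assumptions, not the authors' algorithm:

```python
import numpy as np

def big_batch_sgd(grad_fn, x0, n_data, batch0=8, lr=0.1, steps=100,
                  theta=1.0, seed=0):
    """Sketch of an adaptive-batch-size SGD loop. grad_fn(x, idx) must return
    the per-sample gradients for indices idx, as an array of shape
    (len(idx), dim). The batch doubles whenever the estimated variance of the
    batch-mean gradient exceeds theta times its squared norm."""
    rng = np.random.default_rng(seed)
    x, batch = np.asarray(x0, float), batch0
    for _ in range(steps):
        idx = rng.choice(n_data, size=min(batch, n_data), replace=False)
        g = grad_fn(x, idx)                       # per-sample gradients
        mean_g = g.mean(axis=0)
        var_g = g.var(axis=0).sum() / len(idx)    # variance of the batch mean
        if var_g > theta * np.dot(mean_g, mean_g):
            batch = min(2 * batch, n_data)        # noise dominates: grow batch
        x = x - lr * mean_g
    return x, batch
```

The appeal is that the learning rate can stay fixed: as the iterate approaches a minimizer, the batch grows and the gradient estimate sharpens, playing the role that a decaying step size plays in classical SGD.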
INI 1  
10:40 to 11:10  Morning Coffee  
11:10 to 12:00 
Vladimir Kolmogorov Valued Constraint Satisfaction Problems
I will consider the Valued Constraint Satisfaction Problem (VCSP), whose goal is to minimize a sum of local terms, where each term comes from a fixed set of functions (called a "language") over a fixed discrete domain. I will present recent results characterizing languages that can be solved using the basic LP relaxation. This includes languages consisting of submodular functions, as well as their generalizations. One such generalization is k-submodular functions. In the second part of the talk I will present an application of such functions in computer vision. Based on joint papers with Igor Gridchyn, Andrei Krokhin, Michal Rolinek, Johan Thapper and Stanislav Zivny. 
INI 1  
12:00 to 12:50 
Sung Ha Kang Efficient Numerical Methods for Variational Inpainting Models
Co-authors: Maryam Yashtini (Georgia Institute of Technology), Wei Zhu (The University of Alabama). Recent developments in fast algorithms, based on operator splitting, augmented Lagrangian, and alternating minimization, have enabled us to revisit some of the variational image inpainting models. In this talk, we will present some fast algorithms for Euler's elastica image inpainting model, and a variational edge-weighted image colorization model based on chromaticity and brightness models. The main ideas of the models and algorithms, some analysis, and numerical results will be presented. 
INI 1  
12:50 to 14:00  Lunch @ Wolfson Court  
14:00 to 14:50 
Jalal Fadili Sensitivity Analysis with Degeneracy: Mirror-Stratifiable Functions
This talk will present a set of sensitivity analysis and activity identification results for a class of convex functions with a strong geometric structure, which we coin "mirror-stratifiable". These functions are such that there is a bijection between a primal and a dual stratification of the space into partitioning sets, called strata. This pairing is crucial to track the strata that are identifiable by solutions of parametrized optimization problems or by iterates of optimization algorithms. This class of functions encompasses all regularizers routinely used in signal and image processing, machine learning, and statistics. We show that this mirror-stratifiable structure enjoys a nice sensitivity theory, allowing us to study the stability of solutions of optimization problems under small perturbations, as well as activity identification of first-order proximal splitting-type algorithms. Existing results in the literature typically assume that, under a non-degeneracy condition, the active set associated to a minimizer is stable under small perturbations and is identified in finite time by optimization schemes. In contrast, our results do not require any non-degeneracy assumption: in consequence, the optimal active set is not necessarily stable anymore, but we are able to track precisely the set of identifiable strata. We show that these results have crucial implications when solving challenging ill-posed inverse problems via regularization, a typical scenario where the non-degeneracy condition is not fulfilled. Our theoretical results, illustrated by numerical simulations, allow us to characterize the instability behaviour of the regularized solutions, by locating the set of all low-dimensional strata that can potentially be identified by these solutions. This is joint work with Jérôme Malick and Gabriel Peyré. 
INI 1  
14:50 to 15:40 
Zuoqiang Shi Low-dimensional manifold model for image processing
In this talk, I will introduce a novel low-dimensional manifold model for image processing problems. This model is based on the observation that for many natural images, the patch manifold usually has a low-dimensional structure. We then use the dimension of the patch manifold as a regularization to recover the original image. Using formulas from differential geometry, this problem is reduced to solving a Laplace-Beltrami equation on the manifold. The Laplace-Beltrami equation is solved by the point integral method. Numerical tests show that this method gives very good results in image inpainting, denoising and super-resolution problems. This is joint work with Stanley Osher and Wei Zhu. 
INI 1  
15:40 to 16:10  Afternoon Tea  
16:10 to 17:00 
Gabriele Steidl Convex Analysis in Hadamard Spaces
Joint work with M. Bacak, R. Bergmann, M. Montag and J. Persch. The aim of the talk is twofold. 1. A well-known result of H. Attouch states that the Mosco convergence of a sequence of proper convex lower semicontinuous functions defined on a Hilbert space is equivalent to the pointwise convergence of the associated Moreau envelopes. We generalize this result to Hadamard spaces. More precisely, while it has already been known that the Mosco convergence of a sequence of convex lower semicontinuous functions on a Hadamard space implies the pointwise convergence of the corresponding Moreau envelopes, the converse implication was an open question. We now fill this gap. Our result has several consequences. It implies, for instance, the equivalence of the Mosco and Frolik-Wijsman convergences of convex sets. As another application, we show that there exists a complete metric on the cone of proper convex lower semicontinuous functions on a separable Hadamard space such that a sequence of functions converges in this metric if and only if it converges in the sense of Mosco. 2. We extend the parallel Douglas-Rachford algorithm to the manifold-valued setting. 
INI 1  
19:30 to 22:00  Formal Dinner at Emmanuel College 
09:00 to 09:50 
Xue-Cheng Tai Fast Algorithms for Euler's Elastica energy minimization and applications
This talk is divided into three parts. In the first part, we will introduce the essential ideas in using augmented Lagrangian/operator-splitting techniques for fast numerical algorithms for minimizing Euler's elastica energy. In the second part, we consider an Euler's elastica based image segmentation model. An interesting feature of this model lies in its preference for convex segmentation contours. However, due to the high-order and non-differentiable term, it is often nontrivial to minimize the associated functional. In this work, we propose using the augmented Lagrangian method to tackle the minimization problem. In particular, we design a novel augmented Lagrangian functional that deals with the mean curvature term differently than those in previous works. The new treatment reduces the number of Lagrange multipliers employed, and more importantly, it helps represent the curvature more effectively and faithfully. Numerical experiments validate the efficiency of the proposed augmented Lagrangian method and also demonstrate new features of this particular segmentation model, such as shape-driven and data-driven properties. In the third part, we will introduce some recent fast algorithms for minimizing Euler's elastica energy for interface problems. The method combines level set and binary representations of interfaces. The algorithm only needs to solve a Rudin-Osher-Fatemi problem and to redistance the level set function in order to minimize the elastica energy. The algorithm is easy to implement and computationally efficient. The content of this talk is based on joint work with Egil Bae, Tony Chan, Jinming Duan and Wei Zhu.
Related links:
1) ftp://ftp.math.ucla.edu/pub/camreport/cam1736.pdf
2) https://www.researchgate.net/profile/Xue_Cheng_Tai/publication/312519936_Augmented_Lagrangian_method_for_an_Euler's_elastica_based_segmentation_model_that_promotes_convex_contours/links/58a1b9d292851c7fb4c1907f/AugmentedLagrangianmethodforanEulerselasticabasedsegmentationmodelthatpromotesconvexcontours.pdf
3) https://www.researchgate.net/publication/257592616_Image_Segmentation_Using_Euler%27s_Elastica_as_the_Regularization.

INI 1  
09:50 to 10:40 
Thomas Pock End-to-end learning of CNN features in discrete optimization models for motion and stereo
Co-authors: Patrick Knöbelreiter (Graz University of Technology), Alexander Shekhovtsov (Technical University of Prague), Gottfried Munda (Graz University of Technology), Christian Reinbacher (Amazon). For many years, discrete optimization models such as conditional random fields (CRFs) have defined the state-of-the-art for classical correspondence problems such as motion and stereo. One of the most important ingredients in those models is the choice of the feature transform that is used to compute the similarity between image patches. For a long time, hand-crafted features such as the celebrated scale-invariant feature transform (SIFT) defined the state-of-the-art. Triggered by the recent success of convolutional neural networks (CNNs), it is quite natural to learn such a feature transform from data. In this talk, I will show how to efficiently learn such CNN features from data using an end-to-end learning approach. It turns out that our learned models yield state-of-the-art results on a number of established benchmark databases.

INI 1  
10:40 to 11:10  Morning Coffee  
11:10 to 12:00 
Dimitris Metaxas TBA 
INI 1  
12:00 to 12:50 
Yiqiu Dong Directional Regularization for Image Reconstruction
In this talk, I will introduce a new directional regularization based on the total generalized variation (TGV), which is very useful for applications with strong directional information. I will show that it has the same essential properties as TGV. With automatic direction estimators, we demonstrate the improvement of using directional TGV compared to standard TGV. Numerical simulations are carried out for image restoration and computed tomography reconstruction.

INI 1  
12:50 to 14:00  Lunch @ Wolfson Court  
14:00 to 14:50 
Michael Moeller Sublabel-Accurate Relaxation of Nonconvex Energies
In this talk I will present a convex relaxation technique for a particular class of energy functionals consisting of a pointwise nonconvex data term and a total variation regularization as frequently used in image processing and computer vision problems. The method is based on the technique of functional lifting in which the minimization problem is reformulated in a higher dimensional space in order to obtain a tighter approximation of the original nonconvex energy.

INI 1  
14:50 to 15:40 
Audrey Repetti Joint imaging and calibration using nonconvex optimization
Co-authors: Jasleen Birdi (Heriot-Watt University), Yves Wiaux (Heriot-Watt University). New generations of imaging devices aim to produce high-resolution and high-dynamic-range images. In this context, the associated high-dimensional inverse problems can become extremely challenging from an algorithmic viewpoint. In addition, the quality and accuracy of the reconstructed images often depend on the precision with which the imaging device has previously been calibrated. Unfortunately, calibration does not depend only on the device but may also depend on the time and direction of the acquisitions. This leads to the need to perform joint image reconstruction and calibration, and thus to solve non-convex blind deconvolution problems. We focus on the joint calibration and imaging problem in the context of radio-interferometric imaging in astronomy. In this case, the sparse images of interest can reach gigapixel or terapixel size, while the calibration variables consist of a large number of low-resolution images related to each antenna of the telescope. To solve this problem, we leverage a block-coordinate forward-backward algorithm, specifically designed to minimize non-smooth, non-convex and high-dimensional objective functions. We demonstrate by simulation the performance of this first joint imaging and calibration method in radio astronomy. 
INI 1  
15:40 to 16:10  Afternoon Tea  
16:10 to 17:00 
Christian Clason Convex regularization of discrete-valued inverse problems
We consider inverse problems where a distributed parameter is known a priori to take on values only from a given discrete set. This property can be promoted in Tikhonov regularization with the aid of a suitable convex but non-differentiable regularization term. This allows applying standard approaches to show well-posedness and convergence rates in the Bregman distance. Using the specific properties of the regularization term, it can be shown that convergence (albeit without rates) actually holds pointwise. Furthermore, the resulting Tikhonov functional can be minimized efficiently using a semismooth Newton method. Numerical examples illustrate the properties of the regularization term and the numerical solution.
This is joint work with Thi Bich Tram Do, Florian Kruse, and Karl Kunisch. 
INI 1 
09:00 to 09:50 
Mila Nikolova Alternating proximal gradient descent for nonconvex regularised problems with multi-convex coupling terms
Co-author: Pauline Tan. There has been an increasing interest in constrained nonconvex regularized block multi-convex optimization problems. We introduce an approach that effectively exploits the multi-convex structure of the coupling term and enables complex application-dependent regularization terms to be used. The proposed Alternating Structure-Adapted Proximal gradient descent algorithm enjoys simple, well-defined updates. Global convergence of the algorithm to a critical point is proved using the so-called Kurdyka-Łojasiewicz property. What is more, we prove that a large class of useful objective functions obeying our assumptions are subanalytic and thus satisfy the Kurdyka-Łojasiewicz property. Finally, we present an application of the algorithm to big-data airborne sequences of images. 
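A minimal instance of an alternating proximal gradient scheme on a block multi-convex coupling term is nonnegative matrix factorization. The sketch below is illustrative only: it omits the regularization terms and the structure-adapted details of the proposed algorithm:

```python
import numpy as np

def palm_nmf(M, rank, iters=300, seed=0):
    """PALM-style alternating proximal gradient sketch for the multi-convex
    problem min_{X,Y >= 0} 0.5*||M - X Y||_F^2. The coupling term is convex
    in each block; each block update is a gradient step with step size
    1/Lipschitz, followed by the prox of the constraint (projection onto
    the nonnegative orthant)."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    X = rng.random((m, rank))
    Y = rng.random((rank, n))
    for _ in range(iters):
        # block X: the gradient X -> (X Y - M) Y^T is ||Y Y^T||_2-Lipschitz
        Lx = np.linalg.norm(Y @ Y.T, 2) + 1e-12
        X = np.maximum(X - ((X @ Y - M) @ Y.T) / Lx, 0.0)
        # block Y: the gradient Y -> X^T (X Y - M) is ||X^T X||_2-Lipschitz
        Ly = np.linalg.norm(X.T @ X, 2) + 1e-12
        Y = np.maximum(Y - (X.T @ (X @ Y - M)) / Ly, 0.0)
    return X, Y
```

The Kurdyka-Łojasiewicz machinery mentioned in the abstract is what guarantees that such alternating iterates converge to a critical point despite the overall nonconvexity.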
INI 1  
09:50 to 10:40 
Michael Unser Representer theorems for ill-posed inverse problems: Tikhonov vs. generalized total-variation regularization
In practice, ill-posed inverse problems are often dealt with by introducing a suitable regularization functional. The idea is to stabilize the problem while promoting "desirable" solutions. Here, we are interested in contrasting the effect of Tikhonov vs. total-variation-like regularization. To that end, we first consider a discrete setting and present two representer theorems that characterize the solution of general convex minimization problems subject to $\ell_2$ vs. $\ell_1$ regularization constraints. Next, we adopt a continuous-domain formulation where the regularization seminorm is a generalized version of total variation tied to some differential operator L. We prove that the extreme points of the corresponding minimization problem are non-uniform L-splines with fewer knots than the number of measurements. For instance, when L is the derivative operator, the solution is piecewise constant, which confirms a standard observation and explains why the solution is intrinsically sparse. The powerful aspect of this characterization is that it applies to any linear inverse problem. 
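The contrast between the two discrete representer theorems can be seen numerically: $\ell_2$ regularization returns a generically dense solution, while $\ell_1$ returns a sparse one. A small sketch, where the problem sizes, solvers, and parameters are illustrative choices:

```python
import numpy as np

def ridge(A, y, lam):
    """l2 (Tikhonov) solution of min 0.5*||Ax - y||^2 + 0.5*lam*||x||^2:
    a linear solve, dense in general."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

def ista(A, y, lam, steps=500):
    """l1-regularized solution of min 0.5*||Ax - y||^2 + lam*||x||_1 via
    ISTA (proximal gradient): each step is a gradient step on the data term
    followed by soft-thresholding, which zeroes out small coefficients."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the data term
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        v = x - (A.T @ (A @ x - y)) / L
        x = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)
    return x
```

On data generated from a sparse ground truth, the $\ell_1$ solution typically recovers the sparse support, while the $\ell_2$ solution spreads energy over all coefficients, matching the talk's dichotomy.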
INI 1  
10:40 to 11:10  Morning Coffee  
11:10 to 12:00 
Pierre Weiss Estimation of linear operators from scattered impulse responses
Co-authors: Paul Escande (Université de Toulouse), Jérémie Bigot (Université de Toulouse). In this talk, I will propose a variational method to reconstruct operators with smooth kernels from scattered and noisy impulse responses. The proposed approach relies on the formalism of smoothing in reproducing kernel Hilbert spaces and on the choice of an appropriate regularization term that takes the smoothness of the operator into account. It is numerically tractable in very large dimensions and yields a representation that can be used for achieving fast matrix-vector products. We study the estimator's robustness to noise and analyze its approximation properties with respect to the size and the geometry of the dataset. It turns out to be minimax optimal. We finally show applications of the proposed algorithms to the reconstruction of spatially varying blur operators in microscopy imaging. 
INI 1  
12:00 to 12:50 
Olga Veksler Adaptive and Move-Making Auxiliary Cuts for Binary Pairwise Energies
Co-author: Lena Gorelick (University of Western Ontario). Many computer vision problems require optimization of binary non-submodular energies. In this context, local iterative submodularization techniques based on trust region (LSA-TR) and auxiliary functions (LSA-AUX) have recently been proposed. They achieve state-of-the-art results on a number of computer vision applications. We extend the LSA-AUX framework in two directions. First, unlike LSA-AUX, which selects auxiliary functions based solely on the current solution, we propose to incorporate several additional criteria. This results in tighter bounds for configurations that are more likely or closer to the current solution. Second, we propose move-making extensions of LSA-AUX which achieve tighter bounds by restricting the search space. Finally, we evaluate our methods on several applications. We show that for each application at least one of our extensions significantly outperforms the original LSA-AUX. Moreover, the best extension of LSA-AUX is comparable to or better than LSA-TR on four out of six applications. 
INI 1  
12:50 to 14:00  Lunch @ Wolfson Court  
14:00 to 14:50 
Thomas Vogt Optimal Transport-Based Total Variation for Functional Lifting and Q-Ball Imaging
Co-author: Jan Lellmann (Institute of Mathematics and Image Computing, University of Lübeck). One strategy in functional lifting is to consider probability measures on the label space of interest, which can be discrete or continuous. The considered functionals often make use of a total variation regularizer which, when lifted, allows for a dual formulation introducing a Lipschitz constraint. In our recent work, we proposed to use a similar formulation of total variation for the restoration of so-called Q-ball images. In this talk, we present a mathematical framework for total variation regularization that is inspired by the theory of Optimal Transport and that covers all of the previous cases, including probability measures on discrete and continuous label spaces and on manifolds. This framework nicely explains the above-mentioned Lipschitz constraint and comes with a robust theoretical background. 
INI 1  
14:50 to 15:40 
Martin Holler Total Generalized Variation for Manifold-valued Data
Co-authors: Kristian Bredies (University of Graz), Martin Storath (University of Heidelberg), Andreas Weinmann (Darmstadt University of Applied Sciences). Introduced in 2010, the total generalized variation (TGV) functional is nowadays amongst the most successful regularization functionals for variational image reconstruction. It is defined for an arbitrary order of differentiation and provides a convex model for piecewise smooth vector-space data. On the other hand, variational models for manifold-valued data have become popular recently, and many successful approaches, such as first- and second-order TV regularization, have been generalized to this setting. Despite the fact that TGV regularization is generally considered to be preferable to such approaches, an appropriate extension for manifold-valued data was still missing. In this talk we introduce the notion of second-order total generalized variation (TGV) regularization for manifold-valued data. We provide an axiomatic approach to formalize reasonable generalizations of TGV to the manifold setting and present concrete instances that fulfill the proposed axioms. We prove well-posedness results and present algorithms for a numerical realization of these generalizations in the manifold setup. Further, we provide experimental results for synthetic and real data to underpin the proposed generalization numerically and show its potential for applications with manifold-valued data. 
INI 1  
15:40 to 16:10  Afternoon Tea  
16:10 to 17:00 
Tammy Riklin Raviv Variational Methods for Image Segmentation
In the talk I will present variational methods for image segmentation, with application to brain MRI tissue classification. In particular, I will present an 'unconventional' use of the multinomial logistic regression function. This is joint work with Jacob Goldberger, Shiri Gordon and Boris Kodner. 
INI 1 