Summer School on

Domain Adaptation in Image Analysis

Copenhagen, Denmark, August 20-24, 2012

Schedule

The Summer School will consist of lectures and exercises, given mainly by the four invited speakers. Local speakers will provide examples of domain adaptation from their own research and relate the topic to the research at DIKU and DTU Informatics. In addition to the scientific program, there will be a Summer School dinner on Thursday evening.

A tentative program is available as a PDF here.

Invited Lecturers

Corinna Cortes

The lectures will cover:

  • Variations of support vector machines (SVMs) for domain adaptation.
  • Boosting methods for domain adaptation.
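
As a concrete point of reference for the first topic, the sketch below is a minimal illustration (with made-up data, and not one of the specific algorithms presented in the lectures) of one simple SVM variation for domain adaptation: training on pooled source and target examples while down-weighting the source examples via per-sample weights.

    # Minimal sketch: instance-weighted SVM for domain adaptation.
    # Assumptions: made-up source data (X_src, y_src), a small labeled
    # target set (X_tgt, y_tgt), and an arbitrarily chosen source weight.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X_src = rng.normal(0.0, 1.0, size=(200, 2))
    y_src = (X_src[:, 0] > 0).astype(int)
    X_tgt = rng.normal(0.5, 1.2, size=(20, 2))   # shifted target domain
    y_tgt = (X_tgt[:, 0] > 0.5).astype(int)

    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([y_src, y_tgt])
    # Source examples count less than target examples during training.
    weights = np.concatenate([np.full(len(X_src), 0.3),
                              np.full(len(X_tgt), 1.0)])

    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(X, y, sample_weight=weights)
    print("accuracy on the labeled target examples:", clf.score(X_tgt, y_tgt))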

Suggested reading material:

Mehryar Mohri

Lecture 1: Convex Optimization.
This lecture covers:

  • convex optimization (definition, properties, theorems).
  • linear programming (LP).
  • quadratic programming (QP).
  • semidefinite programming (SDP).

Suggested reading material:

  • Convex Optimization. Boyd, Stephen P. and Vandenberghe, Lieven. Cambridge University Press, 2004.
  • Foundations of Machine Learning. Mohri, Rostamizadeh, and Talwalkar. MIT Press, 2012. (Note that every attendee of the Summer School receives a copy of this book.)
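
To make the LP part concrete, here is a small, self-contained example (with arbitrary problem data, not taken from the lecture material) of solving a linear program with SciPy:

    # Minimal LP sketch: minimize c^T x subject to A_ub x <= b_ub, x >= 0.
    # The problem data below are arbitrary illustrative values.
    import numpy as np
    from scipy.optimize import linprog

    c = np.array([-1.0, -2.0])                   # maximize x1 + 2*x2
    A_ub = np.array([[-1.0, 1.0], [1.0, 2.0]])   # -x1 + x2 <= 1, x1 + 2*x2 <= 4
    b_ub = np.array([1.0, 4.0])

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print("optimal x:", res.x, "objective value:", -res.fun)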


Lecture 2: Adaptation in Regression: Generalization Bounds and Algorithms.
This lecture covers:

  • learning bounds for regularization-based algorithms.
  • the discrepancy minimization algorithm and its optimization.
  • empirical results.
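
The sketch below is a deliberately simple reweighting baseline, not the discrepancy minimization algorithm of the lecture; it assumes importance weights for the source examples are already given (in practice they would have to be estimated, e.g. as density ratios between target and source inputs).

    # Simplified sketch: reweighted regularized (ridge) regression.
    # NOT the discrepancy minimization algorithm of the lecture; the
    # importance weights w are assumed to be given.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    X_src = rng.uniform(-1.0, 1.0, size=(300, 1))
    y_src = np.sin(3 * X_src[:, 0]) + 0.1 * rng.normal(size=300)

    # Hypothetical weights emphasizing the region where the target
    # distribution is assumed to put its mass (here: larger inputs).
    w = np.exp(2.0 * X_src[:, 0])
    w /= w.mean()

    model = Ridge(alpha=1.0)
    model.fit(X_src, y_src, sample_weight=w)
    print("coefficient:", model.coef_, "intercept:", model.intercept_)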

Suggested reading material:

Yishay Mansour

Lecture 1: Generalization Bounds
This lecture covers:

  • Review of generalization bounds (VC dimension, Rademacher complexity).
  • Single-source domain adaptation bounds: discrepancy distance, bounds for hypothesis classes of bounded VC dimension, and re-weighting of source examples.
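
For orientation, one standard Rademacher complexity bound of the kind reviewed here (it appears in the Foundations of Machine Learning book distributed at the school) reads: for a loss bounded in [0, 1], with probability at least 1 - \delta over an i.i.d. sample S of size m, every hypothesis h in H satisfies

    R(h) \le \widehat{R}_S(h) + 2\,\mathfrak{R}_m(\mathcal{G}) + \sqrt{\frac{\log(1/\delta)}{2m}}

where R denotes the expected loss, \widehat{R}_S the empirical loss on S, and \mathfrak{R}_m(\mathcal{G}) the Rademacher complexity of the loss class \mathcal{G} = \{ (x, y) \mapsto L(h(x), y) : h \in H \}.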

Suggested reading material:


Lecture 2: Multi-source Adaptation
The motivation is based on the observation that in many cases one has huge amounts of unlabeled data from the target domain and fairly good classifiers from related source domains. The goal is to combine those classifiers to predict in the new target domain, for which only unlabeled examples are available. The surprising result is that there exists a class of combination rules that is guaranteed, in a certain sense, to transfer the predictive power of the source domains to the new target domain.
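
As a pointer to the kind of rules meant here, the distribution-weighted combination studied by Mansour, Mohri, and Rostamizadeh takes source distributions D_1, ..., D_k, source hypotheses h_1, ..., h_k, and a weight vector z in the simplex, and predicts

    h_z(x) = \sum_{i=1}^{k} \frac{z_i D_i(x)}{\sum_{j=1}^{k} z_j D_j(x)} \, h_i(x)

so that each source hypothesis is weighted by how much probability mass its source distribution places on the input x.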

Suggested reading material:

Trevor Darrell

Lectures: Domain Adaptation in Object Recognition

Domain adaptation is an important emerging topic in computer vision. I'll review recent studies of domain shift in the context of object recognition, and cover new methods that adapt object models acquired in a particular visual domain to new imaging conditions by learning a transformation that minimizes the effect of domain-induced changes in the feature distribution. Transformations may be learned in a supervised manner and can be applied to categories for which there are no labeled examples in the new domain. While I'll focus the presentation on object recognition tasks, the transform-based adaptation technique is general and could be applied to non-image data. I'll also describe methods for multi-source domain adaptation, and adaptation in the context of discriminative learning frameworks, as well as available online datasets relevant to vision-based domain adaptation.
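
The numpy sketch below is a deliberately simplified illustration of the transform-based idea and not one of the specific methods covered in the lectures: it maps source features so that their mean and covariance match those of the (unlabeled) target features.

    # Simplified sketch of transform-based feature alignment (illustration
    # only): whiten the source features and re-color them with the target
    # covariance so the two feature distributions agree up to second order.
    import numpy as np

    def align_second_order(X_src, X_tgt, eps=1e-6):
        """Map source features toward the target mean and covariance."""
        mu_s, mu_t = X_src.mean(axis=0), X_tgt.mean(axis=0)
        C_s = np.cov(X_src, rowvar=False) + eps * np.eye(X_src.shape[1])
        C_t = np.cov(X_tgt, rowvar=False) + eps * np.eye(X_tgt.shape[1])

        def sqrtm(C):
            # Symmetric matrix square root via eigendecomposition.
            vals, vecs = np.linalg.eigh(C)
            return (vecs * np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T

        whiten = np.linalg.inv(sqrtm(C_s))   # remove source covariance
        recolor = sqrtm(C_t)                 # impose target covariance
        return (X_src - mu_s) @ whiten @ recolor + mu_t

    rng = np.random.default_rng(0)
    X_src = rng.normal(0.0, 1.0, size=(500, 3))
    X_tgt = rng.normal(1.0, 2.0, size=(400, 3))
    X_adapted = align_second_order(X_src, X_tgt)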


Tutorial: Domain Adaptation in Object Recognition

In this practical exercise, we will explore domain adaptation problems and standard algorithms on the PASCAL motorbike->bike data (Ref. 1) and the Office dataset (Ref. 2). The exercise is organized by Trevor Darrell and conducted by Kersten Petersen.

Suggested reading material:

Guest Lecturers

Tobias Glasmachers

Lecture: Multi-class Support Vector Machines
Extending the standard Support Vector Machine classifier to problems with more than two classes is a non-trivial issue, for which a multitude of solutions has been proposed. The lecture will be a round trip through the diverse domain of large-margin multi-category classification. We will see how most approaches can be understood systematically within a unifying framework.
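
As a baseline to keep in mind, the two most common reductions of the multi-class problem to binary SVMs, one-vs-rest and one-vs-one, can be tried in a few lines; the sketch below uses scikit-learn on a standard toy dataset and does not implement the unified large-margin formulations discussed in the lecture.

    # Sketch: the two standard reductions of multi-class SVM to binary SVMs.
    from sklearn.datasets import load_iris
    from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)        # three-class toy problem

    ovr = OneVsRestClassifier(SVC(kernel="rbf", C=1.0)).fit(X, y)
    ovo = OneVsOneClassifier(SVC(kernel="rbf", C=1.0)).fit(X, y)

    print("one-vs-rest training accuracy:", ovr.score(X, y))
    print("one-vs-one  training accuracy:", ovo.score(X, y))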

Suggested reading material:

The audience should be familiar with "standard" support vector machines for binary classification as presented in the introductory lectures.


Tutorial: Machine Learning with the Shark Library

The "Shark" machine learning library is a modular C++ library for the design and optimization of adaptive systems. In this hands-on practical course on machine learning with C++ we will go through the different steps of the machine learning processing chain: importing data, pre-processing, training a model, optimizing parameters, and testing the final performance.

Important: It is highly recommended that participants have the Shark library, version 3, pre-installed (and the example programs tested) on their laptops. Installing Shark requires nothing but a modern C++ compiler and a recent version of the Boost libraries. The Shark library is available for download from

http://image.diku.dk/shark/

Suggested reading material:

  • Christian Igel, Verena Heidrich-Meisner, and Tobias Glasmachers: Shark. Journal of Machine Learning Research 9, pp. 993-996, 2008

Marco Loog

Covariate Shift: Some Theory, Some Examples, and Some Observations

Suggested reading material:

Local Lecturers

Christian Igel

Lecture 1: Supervised Machine Learning

Machine learning is about developing and analyzing algorithms that automatically improve with experience. Such algorithms are already an integral part of today's computing systems, for example in search engines, recommender systems, and biometric applications. This short lecture will provide a gentle introduction to supervised machine learning.

Suggested reading material:

  • Christian Igel. Machine Learning: Kernel-based Methods. Lecture Notes, 2012

Lecture 2: An Introduction to Markov Random Fields and Restricted Boltzmann Machines

The tutorial considers Markov random fields (i.e., undirected graphical models) with a special focus on restricted Boltzmann machines (RBMs). In the first part, undirected graphical models are introduced, with image denoising serving as an application example. The second part concentrates on RBMs, which are undirected graphical models describing stochastic neural networks. Recently, RBMs have attracted attention as building blocks of deep belief networks (DBNs).
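
For readers who want to see the mechanics, below is a minimal binary RBM trained with one step of contrastive divergence (CD-1) in numpy; it is a bare-bones illustration on random data, not the implementation used in the tutorial.

    # Minimal binary RBM trained with one-step contrastive divergence (CD-1).
    # Bare-bones numpy illustration on random data, not the tutorial's code.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    n_visible, n_hidden, lr = 16, 8, 0.1
    W = 0.01 * rng.normal(size=(n_visible, n_hidden))
    b_v = np.zeros(n_visible)          # visible biases
    b_h = np.zeros(n_hidden)           # hidden biases

    data = (rng.random((100, n_visible)) > 0.5).astype(float)  # toy patterns

    for epoch in range(50):
        for v0 in data:
            # Positive phase: sample hidden units given a data vector.
            p_h0 = sigmoid(v0 @ W + b_h)
            h0 = (rng.random(n_hidden) < p_h0).astype(float)
            # Negative phase: one Gibbs step back to the visible layer.
            p_v1 = sigmoid(h0 @ W.T + b_v)
            v1 = (rng.random(n_visible) < p_v1).astype(float)
            p_h1 = sigmoid(v1 @ W + b_h)
            # CD-1 parameter updates.
            W += lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
            b_v += lr * (v0 - v1)
            b_h += lr * (p_h0 - p_h1)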

Marleen de Bruijne

Domain Adaptation in Medical Imaging

This talk will cover current approaches to coping with common variations in medical imaging data, as well as a proof-of-concept study of transfer learning for medical image segmentation.
