2015

Title: Perspectives in parallel programming

Lecturer: Marco Danelutto, Dipartimento di Informatica

Period:

 May 13, 11-13, Seminari W
 May 15, 11-13, Seminari W
 May 22, 11-13, Seminari W

(the schedule of the next lectures will be decided on May 13)

Title: Bayesian Machine Learning

Lecturer: Guido Sanguinetti, University of Edinburgh

Place: Computer Science Department, University of Pisa

Period:

  • 23 February, seminar room west, 15-17;
  • 24/25/26 February, seminar room west, 9-11;
  • 27 February, Laboratorio didattico Polo Fibonacci "I", 9-11;
  • 2 March, seminar room west, 16-18;
  • 3/4/5 March, seminar room west, 9-11;
  • 6 March, Laboratorio didattico Polo Fibonacci "I", 9-11.

This is a rough summary of the Bayesian Machine Learning course. The main reference is D. Barber's book, Bayesian Reasoning and Machine Learning; the numbers in brackets refer to chapters and sections of Barber's book unless explicitly stated otherwise. The book is available online from

http://web4.cs.ucl.ac.uk/staff/D.Barber/pmwiki/pmwiki.php?n=Brml.HomePage

  • Lecture 1: Statistical basics. Probability refresher, probability distributions, entropy and KL divergence (Ch 1, Ch 8.2, 8.3). Multivariate Gaussian (8.4). Estimators and maximum likelihood (8.6 and 8.7.3). Supervised and unsupervised learning (13.1).
  • Lecture 2: Linear models. Regression with additive noise and logistic regression (probabilistic perspective): maximum likelihood and least squares (18.1 and 17.4.1). Duality and kernels (17.3).
  • Lecture 3: Bayesian regression models and Gaussian Processes. Bayesian models and hyperparameters (18.1.1, 18.1.2). Gaussian Process regression (19.1-19.4; see also Rasmussen and Williams, Gaussian Processes for Machine Learning, MIT Press, 2006, Ch 2, available for download at http://www.gaussianprocess.org/gpml/).
  • Lecture 4: Active learning and Bayesian optimisation. Active learning, basic concepts and types of active learning (B. Settles, Active Learning Literature Survey, sections 2 and 3, available from http://burrsettles.com/pub/settles.activelearning.pdf). Bayesian optimisation and the GP-UCB algorithm (Brochu et al., see http://arxiv.org/abs/1012.2599).
  • Lab 1: GP regression and Bayesian Optimisation (see the sketch after this list).
  • Lecture 5: Latent variables and mixture models. Latent variables and the EM algorithm (11.1 and 11.2.1). Gaussian mixture models and mixture of experts (20.3, 20.4).
  • Lecture 6: Graphical models. Belief networks and Markov networks (3.3 and 4.2). Factor graphs (4.4).
  • Lecture 7: Exact inference in trees. Message passing and belief propagation (5.1 and 28.7.1).
  • Lecture 8: Approximate inference in graphical models. Variational inference: Gaussian and mean field approximations (28.3, 28.4). Sampling methods and Gibbs sampling (27.4 and 27.3).
  • Lab 2: Bayesian Gaussian mixture models.
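
As a pointer for Lab 1, below is a minimal sketch of Gaussian Process regression in Python using only NumPy. The squared-exponential kernel, the hyperparameter values, and the toy sin(x) data are illustrative assumptions and do not reproduce the actual lab material.

import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential covariance between two sets of 1-D inputs.
    sqdist = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def gp_posterior(X_train, y_train, X_test, noise_var=0.1):
    # Posterior mean and covariance of a zero-mean GP at the test inputs
    # (standard GP regression equations, cf. Rasmussen and Williams, Ch 2).
    K = rbf_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    # Solve linear systems rather than forming an explicit inverse.
    mean = K_s.T @ np.linalg.solve(K, y_train)
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, cov

# Toy usage (hypothetical data): noisy observations of sin(x).
rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, 8)
y_train = np.sin(X_train) + 0.1 * rng.standard_normal(8)
X_test = np.linspace(-3, 3, 50)
mean, cov = gp_posterior(X_train, y_train, X_test)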

Title: Verifiable voting systems and secure protocols: from theory to practice

Lecturer: Peter Y. A. Ryan, Université du Luxembourg

Period: 6-10 July 2015 --- Sala Seminari W, 9-12

Title: Coinductive Methods in Computer Science (and beyond)

Lecturer: Filippo Bonchi and Damien Pous, ENS Lyon

Period: 13 -- 24 April -- all lectures will be in Sala Seminari W

Title: High Dynamic Range Imaging: theory and applications

Lecturer: Francesco Banterle, Visual Computing Laboratory, CNR Pisa

Period: 15-26 June 2015 -- Sala Seminari Ovest, 15-17

Reference page: http://www.banterle.com/francesco/courses/2015/hdri/index.php

Title: Searching by Similarity on a Very Large Scale

Lecturer: Giuseppe Amato, CNR Pisa

Period: end of September/beginning of October 2015

Title: Type systems for process and session calculi

Lecturer: Ilaria Castellani, Rosario Pugliese

Period: October 2015

Place: Università di Firenze, Viale Morgagni

Title: Scientific writing in English

Lecturer: Steven Shore, Department of Physics

Period: February 3, 4, 5, 6

In addition, each student may attend a course from the Master programme in Computer Science, for instance: