July - December 1997

**Organisers**: C M Bishop (*Aston*), D Haussler (*UCSC*), G E Hinton (*Toronto*), M Niranjan (*Cambridge*), L G Valiant (*Harvard*)

The last few years have seen substantial growth in machine learning research, focused in large part on neural network models. For most applications of machine learning, the central issue is generalization.

This NATO ASI will provide a comprehensive and coherent tutorial programme aimed at research scientists at postdoctoral level and beyond, though it will also be accessible to advanced graduate students with a good mathematical background.

**The complete programme is available below**

*NB Please note the change of schedule for Thursday 7th and Wednesday 13th (updated 7/8/97)*

**Organising Committee:**

Director: C M Bishop (Aston)

J M Buhmann (Bonn), G E Hinton (Toronto), M I Jordan (MIT)

**Lecturers:**

E Baum (NEC), C M Bishop (Aston), L Breiman (Berkeley), J M Buhmann (Bonn), P Dayan (MIT), Y Freund (AT&T), G E Hinton (Toronto), T Jaakkola (UCSC), M I Jordan (MIT), Y Le Cun (AT&T), D J C MacKay (Cambridge), R Neal (Toronto), B D Ripley (Oxford), E Sontag (Rutgers), N Tishby (Jerusalem), L G Valiant (Harvard), V Vapnik (AT&T), C K I Williams (Aston)

**Sunday 3 August**

18:00 Welcome reception in the Isaac Newton Institute, and registration

19:00 Dinner in Wolfson Court (for Wolfson Court residents only)

**Monday 4 August**

08:30 Registration

09:00 Bishop (1) *Supervised Learning in Linear Models*

10:30 Coffee

11:00 Williams (1) *Supervised Learning in Non-linear Models*

12:30 Lunch

14:00 Breiman (1) *Instability, bias-variance and regularization*

15:30 Tea

16:00 Le Cun (1) *Generalization in high-dimensional tasks*

17:30 End of session

**Tuesday 5 August**

09:00 Bishop (2) *Model Complexity and Generalization*

10:30 Coffee

11:00 Neal (1) *An illustrative research endeavour: The motivation, the idea, an empirical test, and the final conclusions*

12:30 Lunch

14:00 MacKay (1) *Introduction to Gaussian processes*

15:30 Tea

16:00 Dayan (1) *Unsupervised Learning: Modelling probability distributions*

17:30 End of session

**Wednesday 6 August**

09:00 Hinton (1) *Almost perfect generalization with almost no labelled training data*

10:30 Coffee

11:00 Buhmann (1) *Unsupervised learning and clustering*

12:30 Spotlight presentations of selected posters

13:00 Lunch

Afternoon free for sightseeing, punting etc.

17:00 Wine reception, with poster contributions from participants

**Thursday 7 August**

09:00 Tishby (1) *Statistical physics and phase transitions in learning and generalization*

10:30 Coffee

11:00 Baum (1) *MultiAgent Economics and Reinforcement Learning*

12:30 Lunch

14:00 Breiman (2) *Combining estimators*

15:30 Tea

16:00 Freund (1) *Introduction to Boosting*

17:30 End of session

**Friday 8 August**

09:00 Neal (2) *Monte Carlo methods and their application to Bayesian neural network learning*

10:30 Coffee

11:00 Williams (2) *Generalization in Gaussian processes*

12:30 Lunch

14:00 Tishby (2) *Towards a statistical theory of representation in learning*

15:30 Tea

16:00 Baum (2) *The Economics of Metalearning*

17:30 End of session

**Saturday 9 August**

Free day

**Sunday 10 August**

Coach tour of Woolsthorpe Manor (the birthplace of Isaac Newton) and the town of Lincoln

**Monday 11 August**

09:00 Jordan (1) *Introduction to graphical models I*

10:30 Coffee

11:00 Jaakkola (1) *Introduction to graphical models II*

12:30 Lunch

14:00 Hinton (2) *Improving generalization by minimizing the description length of the weights*

15:30 Tea

16:00 Dayan (2) *Correlations and probabilities in population codes*

17:30 End of session

**Tuesday 12 August**

09:00 Jordan (2) *Variational methods for graphical models I*

10:30 Coffee

11:00 Jaakkola (2) *Variational methods for graphical models II*

12:30 Lunch

14:00 Buhmann (2) *Active learning*

15:30 Tea

16:00 MacKay (2) *Information theory, error-correcting codes and belief
networks*

17:30 Pre-dinner drinks reception in the Isaac Newton Institute

**Wednesday 13 August**

09:00 Vapnik (1) *The statistical nature of learning theory*

10:30 Coffee

11:00 Valiant (1) *Introduction to Computational Learning Theory*

12:30 Lunch

14:00 Le Cun (2) *Gradient descent dynamics and generalization*

15:30 Tea

16:00 Ripley (1) *Statistical theories of model fitting*

17:30 End of session

**Thursday 14 August**

09:00 Sontag (1) *The VC and related dimensions for static neural networks*

10:30 Coffee

11:00 Valiant (2) *Recent developments in learning theory*

12:30 Lunch

14:00 Ripley (2) *Are uniform convergence results practically relevant?*

15:30 Tea

16:00 Freund (2) *On-line sequence prediction*

17:30 Conference dinner in Corpus Christi College

**Friday 15 August**

09:00 Sontag (2) *The VC and related dimensions for dynamic neural networks*

10:30 Coffee

11:00 Vapnik (2) *Support vector networks*

12:30 Lunch

14:00 Panel: panel discussion and wrap-up

15:00 End of workshop