Bayesian Models and Learning Through Evolution

Date: 
Monday, September 25, 2017 - 18:30
Source: 
London Machine Learning
Attendees: 
230
City: 
London

Agenda:

- 18:30: Doors open, pizza, beer, networking

- 19:00: First talk

- 20:00: Break & networking

- 20:15: Second talk

- 21:30: Close


• Leveraging expert knowledge: developing and deploying probabilistic graphical models for commercial insurance clients - Anna Schroeder

Commercial insurers cover ‘unusual’ risks – not your or my home insurance, but that of a large car manufacturer’s production site, a fleet of aircraft, artworks, or satellites. The list is long, and no insurance solution is the same as another. To price these risks, we cannot rely on a vast data set of existing customers’ behaviour. Instead, we work with risk engineering experts who specialise, for instance, in assessing fire risk in commercial real estate such as manufacturing sites.

In this talk we will present an end-to-end approach that leverages our in-house expertise to provide a new service to commercial insurance clients. We discuss combining knowledge elicitation methods with data to build probabilistic graphical models. The graphical models are built into a functional programming library with a range of inferential interpreters. We describe the deployment of this model within an app that ultimately allows our clients to understand how infrastructure investments in their property can reduce risk, illustrated with a fire damage use case.
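To make the idea concrete, here is a minimal sketch of a discrete probabilistic graphical model with expert-elicited conditional probabilities and inference by exact enumeration. The variables, probability values, and function names are hypothetical illustrations, not the speaker's actual model or library.

```python
# Minimal sketch: a three-node Bayesian network for fire risk, with
# conditional probability tables elicited from (hypothetical) experts.
# Variables, numbers and names are illustrative only.

from itertools import product

# Elicited priors and conditional probability table:
# P(Sprinklers installed), P(Flammable storage on site),
# P(Major fire damage | Sprinklers, Flammable storage).
P_sprinklers = {True: 0.6, False: 0.4}
P_flammable = {True: 0.3, False: 0.7}
P_damage = {  # keyed by (sprinklers, flammable)
    (True, True): 0.10,
    (True, False): 0.02,
    (False, True): 0.40,
    (False, False): 0.08,
}

def joint(s, f, d):
    """Joint probability P(S=s, F=f, D=d) under the network factorisation."""
    p_d = P_damage[(s, f)] if d else 1.0 - P_damage[(s, f)]
    return P_sprinklers[s] * P_flammable[f] * p_d

def p_damage_given(sprinklers):
    """P(D=True | S=sprinklers) by exact enumeration over the other variables."""
    num = den = 0.0
    for s, f, d in product([True, False], repeat=3):
        if s != sprinklers:
            continue
        p = joint(s, f, d)
        den += p
        if d:
            num += p
    return num / den

# How much would installing sprinklers reduce the risk of major fire damage?
print("P(damage | sprinklers)    =", round(p_damage_given(True), 3))
print("P(damage | no sprinklers) =", round(p_damage_given(False), 3))
```

The brute-force enumeration above merely stands in for one of the interchangeable inferential interpreters the talk describes; a production library would separate model specification from the choice of inference algorithm.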

Bio: [to be added]

• How Evolution can Learn - Richard Watson

An analogy between evolution and simple types of learning has often been noted. Both are adaptive processes that can be conceived as simple trial and error, incremental improvement and/or reward maximisation mechanisms. Yet learning is often associated with intelligent behaviour (artificial and natural), whereas evolution by natural selection is a famously blind, relentless but plodding process - evolution is not smart. So says the conventional view. Curious, then, that its results are so spectacular. Is that just a matter of time and 'computational' resources, or is the analogy with learning deeper than it initially appears to be?

A series of recent works shows that evolution by natural selection is formally equivalent to more sophisticated types of learning. For example, the action of selection on heritable variation in gene-regulatory connections is equivalent to associative learning principles familiar in neural networks. This helps us understand how the evolutionary process can accumulate knowledge about the selective environment that, in addition to refining specific solutions, also refines its ability to evolve (by changing the distribution of variants that it samples). More radically, the evolutionary transitions in individuality (e.g. from unicellular life to multicellular organisms) suggest an ability for evolution to scale up the units in which it operates through multiple levels of biological organisation. The relationship between such 'deep evolution' and deep learning approaches is not superficial. Taken together, these links with learning suggest that the gap between the conventional 'blind' view of natural selection and the surprising cleverness of its results may be due to the fact that biological evolution is actually significantly smarter than we realised.
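The claimed link between selection on gene-regulatory connections and associative learning can be illustrated with a toy simulation. The sketch below is a simplification under stated assumptions (Hopfield-style developmental dynamics, a single selected phenotype, hill-climbing as a stand-in for selection); parameters and functions are illustrative, not the models from the talk. The tendency it shows is that the evolved regulatory weights come to resemble the Hebbian outer-product matrix an associative memory would store.

```python
# Toy sketch: selection on heritable variation in 'gene-regulation' weights
# behaves like associative (Hebbian) learning. Illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
N = 10                                      # number of genes / traits
target = rng.choice([-1, 1], size=N)        # phenotype favoured by the environment
initial_states = rng.choice([-1, 1], size=(5, N))  # fixed embryonic start states

def develop(W, p0, steps=10):
    """Run the regulatory dynamics from an embryonic state to an adult phenotype."""
    p = p0.copy()
    for _ in range(steps):
        p = np.sign(W @ p + 1e-9)           # small offset avoids sign(0)
    return p

def fitness(W):
    """Average alignment of developed phenotypes with the selected target."""
    return np.mean([target @ develop(W, p0) for p0 in initial_states]) / N

W = np.zeros((N, N))
best = fitness(W)
for generation in range(2000):
    mutant = W.copy()
    i, j = rng.integers(N, size=2)
    mutant[i, j] += rng.normal(scale=0.1)   # heritable variation in one connection
    f = fitness(mutant)
    if f >= best:                           # selection keeps non-worse variants
        W, best = mutant, f

hebbian = np.outer(target, target)          # what Hebbian learning would store
mask = W != 0
agreement = np.mean(np.sign(W[mask]) == np.sign(hebbian[mask]))
print(f"fitness of evolved network:           {best:.2f}")
print(f"evolved weights matching Hebbian sign: {agreement:.0%}")
```

In a toy run the agreement is approximate rather than exact, since neutral weights can drift, but the direction of the effect is what the abstract's formal equivalence makes precise.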

Bio: Richard A. Watson is an associate professor in the Agents, Interaction and Complexity research group at the University of Southampton's School of Electronics and Computer Science, and a member of the Institute for Life Science, Southampton. He received his BA in Artificial Intelligence from the University of Sussex in 1990 and then worked in industry for five years. Returning to academia, he chose Sussex again for an MSc in Evolutionary and Adaptive Systems, where he was introduced to evolutionary modeling. His PhD in computer science at Brandeis University (2002) resulted in 22 publications and a dissertation addressing the algorithmic concepts underlying the major transitions in evolution. A postdoctoral position at Harvard University's Department of Organismic and Evolutionary Biology provided training to complement his computer science background. He now has over 100 publications on topics spanning evolutionary biology, evolutionary computation, population genetics, neural networks and computational biology. He is the author of Compositional Evolution: The Impact of Sex, Symbiosis, and Modularity on the Gradualist Framework of Evolution (MIT Press, 2006). 

AHL, Riverbank House, 2 Swan Lane, EC4R 3AD