David's subject was the development of statistical algorithms that would examine the outputs of "sensors" attached to some machine or person and try to deduce, from those outputs collectively, whether their host was behaving abnormally and therefore needed maintenance attention. He conceded that the computer was unlikely to be better than an expert human observer, but the permanent presence of a human expert is expensive, and the computer can assist experts by directing their attention to significant events. The problem was to attempt to represent human expertise "in a box".
His original objects of study had been aircraft engines, e.g. in the Eurofighter, Airbus A380 or the still-to-fly Boeing 787. A jet engine might have 20 sensors on it. A human presented with a simultaneous display of all of these would have some difficulty interpreting them, and the quantities of data obtained from the sensors can be substantial: one flight might generate 500 Mbytes, and a single test run of a racing car might generate ten times that. Even the data generated by hospital patients become substantial when whole wards are monitored for long periods. Thus automated methods are required.
It was useful to distinguish between model-based and data-based approaches. Oxford is active in generating computer models of biomedical systems, based on modelling individual system components using a priori knowledge. A data-based system, on the other hand, provides only a moderate understanding of the system itself, but models the behaviour of a whole set of sensor outputs. Primitive systems that look only at a single sensor can usually do little more than set thresholds and sound an alarm when they are exceeded. The result tends to be so many false alarms that the alarms are ignored, or simply turned off. One can do much better by looking at multiple sensors simultaneously, but some automatic learning is then necessary, specific to the system in question, to deduce which combinations of readings are normal and which are dangerously abnormal. Complete failures tend to be rare in modern complex systems, so the behaviour of the system in "normal" conditions is modelled, and events that are statistically "abnormal" compared with that model are used to generate alerts. There is also the problem of reducing multiple variables down to two or three so that they can be displayed graphically on a screen to a mere human!
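The talk did not give implementation details, but the data-based approach described above can be sketched in a simple form: fit a statistical model of "normal" multi-sensor behaviour (here a multivariate Gaussian, with synthetic stand-in data), score new readings by their Mahalanobis distance from it, and flag readings beyond a threshold learnt from the normal data. The same data can be projected onto two principal components for on-screen display. All data and thresholds below are illustrative assumptions, not David's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "normal" training data: 200 readings from 5 sensors.
# In practice this would be recorded from the machine or patient.
normal = rng.normal(size=(200, 5))

# Model normality as a multivariate Gaussian fitted to the training data.
mean = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def novelty_score(x):
    """Squared Mahalanobis distance of a reading from the normal model."""
    d = x - mean
    return float(d @ cov_inv @ d)

# Alert threshold: e.g. the 99th percentile of scores seen in normal data.
scores = np.array([novelty_score(x) for x in normal])
threshold = np.percentile(scores, 99)

typical = np.zeros(5)                          # an unremarkable reading
odd = np.array([4.0, -4.0, 4.0, -4.0, 4.0])    # a jointly unusual combination
print(novelty_score(typical) <= threshold)     # no alert
print(novelty_score(odd) > threshold)          # alert

# Reduction to two dimensions for display, via PCA (SVD of centred data).
centred = normal - mean
_, _, vt = np.linalg.svd(centred, full_matrices=False)
coords_2d = centred @ vt[:2].T                 # (200, 2) points for a scatter plot
print(coords_2d.shape)
```

Note that each individual sensor in the "odd" reading might sit within its own single-sensor limits; it is the combination that the multivariate model flags, which is exactly the advantage over per-sensor thresholds described above.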
David quoted various examples of the application of this method. One was of a jet engine which had somehow acquired a loose nut rolling around inside it. Its vibration spectra could have predicted its eventual failure well before it actually happened. The change was very subtle, and only detectable using statistical methods, rather than conventional alerts. A medical application, in which alarms were based not on individual sensors, of temperature, pulse rate, blood pressure etc., but on a learnt combination of them all, resulted not only in a great drop in false alarms, but in reducing from 50 to zero the number of cardiac arrests over an 18-month period during clinical trials.
The audience responded to the talk with a lively questioning session.
Copyright © 2010 Society of Oxford University Engineers