BBC R&D tests AI in live audience setting

The BBC’s research unit, BBC R&D, is testing how AI and machine learning can be applied in live audience environments by conducting an experiment at the conference on AI, Society and the Media run by the BBC, the LCFI and the Alan Turing Institute.

The goal of the experiment is to gain a better understanding of how AI and machine learning can be applied to production in live studio environments, senior R&D engineer Stephen Jolly said in a blog post.

Jolly said that for AI to be used to control cameras in basic situations, freeing camera operators up for more elaborate shots, computers need to be able to discern what audiences find interesting.

The R&D department is using the conference to film part of the audience, with a view to analysing patterns in the way audience members direct their attention, in order to give AI algorithms data about what might be interesting on stage at any given time.
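The article does not describe the analysis pipeline itself. As a rough illustration of the kind of processing involved, the sketch below uses OpenCV's stock Haar-cascade detectors to count frontal versus profile faces in each frame of an audience recording, as a crude proxy for how much of the audience is facing the stage at a given moment. The video filename, the detector choice and the thresholds are assumptions for illustration only, not the BBC's actual method.

# Illustrative sketch only, not the BBC's pipeline: estimate a per-frame
# "attention" signal as the ratio of frontal to all detected faces in an
# audience recording. Filename and parameters are assumptions.
import cv2

frontal = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
profile = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_profileface.xml")

cap = cv2.VideoCapture("audience.mp4")  # hypothetical recording
attention = []  # per-frame ratio of frontal faces to all detected faces

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    front = frontal.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    side = profile.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    total = len(front) + len(side)
    attention.append(len(front) / total if total else 0.0)

cap.release()
print("mean frontal-face ratio:", sum(attention) / max(len(attention), 1))

A real system would need head-pose or gaze estimation rather than simple face counting, but the output of either approach is the same kind of time series: a signal that can be correlated with what was happening on stage.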

Jolly said that it is not yet clear how useful the experiment will be, with lighting levels and the challenge of getting sufficient numbers of people in focus among the factors that could impede it. The BBC will analyse the recordings after the event to investigate whether patterns of audience attention could be identifiable and useful.
