A global archive of independent reviews of everything happening from the beginning of the millennium


* Changes that autonomous vehicles may have to map on the fly can be occasioned by building works. New buildings can also permanently change reference points in older maps.

It is all a bit technical for this page but when future fleets of driverless cars go out, their sensors will detect changes in the roadscape, such as building works, and feed data back to clouds of GPU-accelerated neural networks, which will map and classify the objects in real time and alert following driverless cars to the changes in the environment.
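The loop described above can be sketched in miniature: a car compares what its sensors currently report against the prior map and flags the differences for vehicles behind it. Everything here is illustrative — the function and data names are invented, and the real systems would run GPU-accelerated neural networks in the cloud rather than a dictionary comparison.

```python
# Hypothetical sketch: diff a prior map against fresh sensor observations
# and package the changes as an alert for following vehicles.

def report_changes(prior_map: dict, observed: dict) -> dict:
    """Return objects that are new, removed, or reclassified since the prior map."""
    new = {k: v for k, v in observed.items() if k not in prior_map}
    removed = {k: v for k, v in prior_map.items() if k not in observed}
    changed = {k: (prior_map[k], observed[k])
               for k in prior_map.keys() & observed.keys()
               if prior_map[k] != observed[k]}
    return {"new": new, "removed": removed, "changed": changed}

# Invented example data: a junction has become roadworks, a new site appeared.
prior = {"junction_3": "clear", "site_a": "building"}
seen = {"junction_3": "roadworks", "site_b": "scaffolding"}
alert = report_changes(prior, seen)
```

The point of the sketch is only the shape of the data flow: detect, diff against the map, broadcast the delta.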



It is a bit like blind wine tasting. It reveals your capacity for misjudgement.

The programme to Beyond the Fence advises you to read it after you have enjoyed the show because it 'is chock-full of spoilers'.

I did this without being bidden as I had decided to try and determine the quality of the computers' contribution to the show without knowing to any significant extent how they had contributed.

I had heard from an American source that a team of machine learning experts in Cambridge had analysed what makes a hit musical and that two of their number had invented a computer lyricist, called Clarissa in the programme, who had trained herself on four GPUs using deep learning. Within days, without having made the connection, I had also attended a lecture given by one of this team but not on the subject of musicals.

I thought a little knowledge of poetry would equip me to analyse the quality of a computer's lyrics. Also, I am interested in how deep learning using cloud GPUs can help autonomous vehicles map the environment on the fly, comparing it with pre-mapped data in the cloud, and, though not exclusively, how this relates to my own world of architecture*.

So I went to the premiere of Beyond the Fence on 22 February 2016 thinking computers had only contributed to the lyrics but I was mistaken.

Listening, I thought perhaps the scenes before the interval had more computer contribution to the lyrics than those after because they seemed more stilted and less readily set to music. Those after were so humorous, flowed so well and were so full of human twists and turns that they seemed beyond computers. In this view, I was travelling blind.

Computers had contributed to aspects of the show in five distinct areas:

- Analysing the ingredients of a hit musical - undertaken at the University of Cambridge;

- Providing a creative spark - the motivating theme of the musical - undertaken at Goldsmiths, University of London;

- Writing a musical theatre plot, building on analysis provided by students at the Guildford School of Acting;

- Writing lyrics - the cloud lyricist, Clarissa, writing for two of the team at Cambridge;

- Writing melodies and chords but not orchestration, undertaken at Durham University, City University, London and Queen Mary College, University of London.

I really did enjoy the show as a musical and loved the way it set out to dispel so many human prejudices. The cast was very endearing. The list of credits appears to extend into three figures so perhaps I should single out no one.

Just as car manufacturers are moving to getting 'farms' of small robots to assist humans in car assembly, addressing cooperation between machines and people, so this musical is about cooperation between computers and people, the former assisting the latter. The real qualities shown by the musical, though, would be beyond the computers in all of the five areas.

The computer personalities do, however, have to get a mention.

The analysis of the elements that make up a successful musical was made from around 100,000 pieces of data drawn from 1700 West End and Broadway musicals but not by a named programme. An interesting result was that musicals set in the 1980s, though few in number, had a higher chance of success than those set in, say, the 1920s and 1930s and this was influential in the choice of Greenham Common as the locus for the musical.
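The kind of aggregate behind that finding — success rate grouped by the decade in which a show is set — is easy to sketch. The figures below are entirely made up for illustration; only the shape of the computation reflects what the analysis of the 1,700 musicals might have involved.

```python
# Illustrative only: group shows by their setting decade and compute the
# fraction that were hits. All sample data here is invented.
from collections import defaultdict

def success_rate_by_setting(shows):
    """shows: list of (setting_decade, was_hit) pairs."""
    totals, hits = defaultdict(int), defaultdict(int)
    for decade, was_hit in shows:
        totals[decade] += 1
        hits[decade] += was_hit  # True counts as 1
    return {d: hits[d] / totals[d] for d in totals}

sample = [("1980s", True), ("1980s", True), ("1980s", False),
          ("1920s", False), ("1920s", True), ("1930s", False)]
rates = success_rate_by_setting(sample)
```

On the invented sample, shows set in the 1980s come out ahead — mirroring the direction, though of course not the substance, of the real result.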

The What-If Machine, a research programme funded by the European Union, had been tasked with generating original fictional ideas that provide the creative spark, or potential central theme, for a musical.

Some of these were rather good like:

What if there was a little instrument who couldn't make music?

What if there were a little bird who couldn't head south?

but the one chosen was:

What if a wounded soldier had to learn how to understand a child in order to find true love?
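Ideas of this shape read like slot-filling over a template, and a toy version can be sketched in a few lines. The template and word lists below are invented for illustration; the real What-If Machine generates and evaluates its ideas far more elaborately.

```python
# A toy slot-filling generator in the spirit of the What-If Machine's output.
# Templates and vocabulary are invented, not taken from the real system.
import random

TEMPLATE = "What if there was a little {thing} who couldn't {ability}?"
ABILITIES = {"instrument": "make music",
             "bird": "head south",
             "lighthouse": "shine"}

def what_if(rng: random.Random) -> str:
    """Fill the template with a randomly chosen subject and its matching deficit."""
    thing = rng.choice(sorted(ABILITIES))
    return TEMPLATE.format(thing=thing, ability=ABILITIES[thing])

idea = what_if(random.Random(0))
```

The hard part, which the sketch omits entirely, is the evaluation step: deciding which of the generated ideas is dramatically promising.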

The male lead is an officer of the base who is not wounded but does have to understand a child who does not speak and whose mother is the female lead and one of the women protestors. They fall in love, from which much of the dramatic tension arises.

The generation and evaluation of the idea involved researchers in Slovenia, Ireland, Spain and Britain.

The fictional idea pushed the writers further towards the Greenham Common theme but they exercised a curating and selection function throughout as they did with the music.

PropperWryter is the computer system that wrote the musical theatre plot based on a dataset of musical plot points analysed from 50 hit musicals by the Guildford students.

It is rather good and its flow of headings is:

Aspiration; Decision to take action; Bond strengthening; Character's reaction; Assistance; Loss of loved one; I am what I am; Struggle; Reconciliation; Solution.

The plot really does follow this schema, though some circuitous loops have been taken so as not to over-complicate the main story - for instance, the loss of the loved one is not of one of the three main characters.
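One simple way to learn such a flow of plot points from a corpus is to count which beat tends to follow which, then chain the most frequent successors. The sketch below does exactly that; the three training "plots" are invented, whereas PropperWryter's dataset was drawn from 50 hit musicals.

```python
# Minimal sketch: learn beat-to-beat transitions from example plots,
# then ask which beat most often follows a given one. Training data invented.
from collections import defaultdict

def learn_transitions(plots):
    """Count, across a corpus of plots, how often each beat follows each other."""
    follows = defaultdict(lambda: defaultdict(int))
    for plot in plots:
        for a, b in zip(plot, plot[1:]):
            follows[a][b] += 1
    return follows

def next_beat(follows, beat):
    """Return the most frequently observed successor of a beat, or None."""
    options = follows[beat]
    return max(options, key=options.get) if options else None

corpus = [
    ["Aspiration", "Decision to take action", "Struggle", "Solution"],
    ["Aspiration", "Decision to take action", "Reconciliation", "Solution"],
    ["Aspiration", "Struggle", "Reconciliation", "Solution"],
]
model = learn_transitions(corpus)
```

Chaining `next_beat` from "Aspiration" reproduces the commonest path through the toy corpus; the real system presumably weighs far richer features than adjacency counts.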

Computers have a problem with natural language and the highest number of computations was done by the Cloud Lyricist. Very few of the lyrics written by Clarissa find their way into the programme so it is hard to gauge success as her words are dropped into songs crafted by the writers. By my rather exacting standards she is falling short as both a lyricist and poet.

About 25% of the lyrics are computer generated, varying in percentage content from 6% in the song Graceful to 32% in the song So Much to Say. However, 100% of the words in We Are Greenham are from her but it seems unlikely that they were put in order by her and the song is a little stilted.
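The overall figure is just the word-weighted average of the per-song shares, which is quick to check. Only the percentages (6%, 32%, 100%) come from the programme; the word counts below are invented placeholders, so the resulting overall share is illustrative, not the show's actual 25%.

```python
# Word-weighted average of computer-generated lyric shares.
# Word counts are invented; only the per-song percentages are from the programme.
songs = {
    "Graceful": (200, 0.06),         # (total words, computer-generated share)
    "So Much to Say": (250, 0.32),
    "We Are Greenham": (150, 1.00),
}

total_words = sum(n for n, _ in songs.values())
machine_words = sum(n * share for n, share in songs.values())
overall = machine_words / total_words
```

With the real word counts, this weighted average across every song in the show is what would yield the quoted 25%.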

The London universities' Digital Music Lab was made to 'listen' to as many digital hits and flops from musicals as possible and Durham University's Android Lloyd Webber (ALW) was fed chord data from it and taught about musicals. It then started to turn out show tunes, dozens a minute. The best melodic themes were extracted by a musical team for use in Beyond the Fence. ALW also wrote music for lyrics.
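A drastically simplified version of that pipeline — learn chord transitions from "hit" progressions, then emit new progressions by random walk — can be sketched as follows. The training progressions are invented; ALW worked from real chord data supplied by the Digital Music Lab and, one assumes, far richer models than a first-order chain.

```python
# Toy chord-progression generator: learn which chord follows which from
# example progressions, then compose by random walk. Training data invented.
import random
from collections import defaultdict

def train(progressions):
    """Record every observed successor of each chord."""
    table = defaultdict(list)
    for prog in progressions:
        for a, b in zip(prog, prog[1:]):
            table[a].append(b)
    return table

def compose(table, start, length, rng):
    """Walk the transition table from a starting chord."""
    prog = [start]
    while len(prog) < length and table[prog[-1]]:
        prog.append(rng.choice(table[prog[-1]]))
    return prog

hits = [["C", "Am", "F", "G", "C"],
        ["C", "F", "G", "C"],
        ["Am", "F", "C", "G"]]
tune = compose(train(hits), "C", 8, random.Random(1))
```

As with ALW's dozens of tunes a minute, the cheap part is generation; the human team's job was selecting the few themes worth keeping.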

Later in the process a Parisian computer system called FlowComposer, which works with composers to help the creative flow, joined the team.

It is not quite clear how much of the music could be said to be written by computers but both the lyrics writing and composing are substantial adventures in computational creativity.

The words of the show's creators in the programme would suggest that sparking ideas was a hit for the computers but that actually producing content was much more problematic.

Nonetheless, this has been a historic moment in theatre and another coda to the events at Greenham Common.