Interactive Storytelling


The changing landscape of media technology in the home means that providers must evolve accordingly: the broadcaster must change to meet the needs of today's audiences. We recognise the inherent need of users to interact with content in a natural way, perhaps as they would interact with each other. Pressing the red button on an iTV remote does not satisfy that need.
With storytelling at the core of the public service broadcaster's remit, how do we populate highly interactive domains with content rich in narrative? Two strawman Proofs-of-Concept in CALLAS explored this in different ways: Interactive Storyteller and Interactive Drama.

Interactive Storyteller

A story is often conveyed so as to allow the audience to experience the journey from the mind's eye of one of the story characters:
 
The audience, to a greater or lesser extent, experiences the sequence of affective states that are felt by the character over the course of the various relationships within the story.
The storyteller's skill in conveying this felt experience is therefore crucial. In live oral storytelling (i.e. the oral traditions), the storyteller can read the affective state of the audience and improvise accordingly, bringing the audience through the desired sequence of affective states.

With books, film, and other traditional broadcast media, the story is invariant and cannot be improvised. The same is true for present-day 'interactive TV'.
However, with the CALLAS Shelf technologies we can change this. In so doing, we aim to appeal to the audience's innate, almost unconscious need to participate in the storytelling experience, bringing back the live elements of the oral traditions.
A first mock-up application was shown at the BBC Festival of Technology.
The Emotional Attentive Agent, as storyteller, adapts the way the story is told according to the inferred affective characteristics of the audience at any point in time. Coupled with this, an image display shows pictures associated with the story as it unfolds.
When the application begins, the user is presented with the image for the first scene. The ECA invites the user to comment about the image, and the user says some words about the scene. Then the ECA conveys the story of the scene to the user, in a way that is appropriate for the inferred mood of the user. At the beginning of the next scene, the image for that scene is presented. Once again, the ECA invites the user to say some words about the scene. This turn-based interaction continues through to the end of the story.
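The turn-based interaction described above can be sketched in pseudocode. This is only an illustrative outline, not the actual CALLAS implementation; the `Scene` class and the callback names (`show_image`, `invite_comment`, `estimate_affect`, `tell_segment`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    image: str          # image shown for this scene
    narrative: str      # the story text for this scene
    ideal_affect: str   # tagged ideal affective state, e.g. "curious"

def run_story(scenes, show_image, invite_comment, estimate_affect, tell_segment):
    """Drive the turn-based loop: show the scene's image, capture the
    user's comment (speech and video), infer their mood, then tell the
    scene in a way appropriate to that mood."""
    for scene in scenes:
        show_image(scene.image)
        speech, video = invite_comment()       # user says a few words
        mood = estimate_affect(speech, video)  # inferred affective state
        tell_segment(scene.narrative, mood, scene.ideal_affect)
```

Each iteration is one turn: the ECA's invitation, the user's response, and the adapted telling of the scene, repeated to the end of the story.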
Content for the application is provided as a sequence of images, each tagged with narrative and with the ideal affective state of the user at that point in the story. Input to the system is captured each time a new image is shown, in the form of speech and video. These are used to estimate the user's affective state, which is then compared with the ideal affective state for that point in the story. The result determines the expressivity of the ECA's words and gestures for the next segment of the story. We have chosen two genres for this application: user-generated news (sets of images tagged with narrative) and fact-based historical drama.
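One way the comparison between inferred and ideal affective states could drive expressivity is sketched below. This is a minimal illustration only, assuming states are represented as (valence, arousal) pairs in [-1, 1]; the representation actually used in CALLAS may differ.

```python
def expressivity(inferred, ideal):
    """Return a gesture-expressivity gain in [0, 1]: the further the
    audience's inferred (valence, arousal) state is from the ideal
    state for this point in the story, the more expressive the ECA
    becomes in order to steer the audience towards it."""
    dv = ideal[0] - inferred[0]
    da = ideal[1] - inferred[1]
    distance = (dv * dv + da * da) ** 0.5
    max_distance = (2 ** 2 + 2 ** 2) ** 0.5  # both axes span [-1, 1]
    return min(distance / max_distance, 1.0)
```

When the audience is already in the ideal state the gain is 0 (neutral delivery); at maximum mismatch it is 1 (fully expressive words and gestures).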

Interactive Drama

In most Interactive Storytelling systems, user interaction is based on natural language communication with virtual agents, either through isolated utterances or through dialogue. Natural language communication is also an essential element of interactive narratives in which the user is supposed to impersonate one of the story's characters. Whilst techniques for narrative generation and agent behaviour have made significant progress in recent years, natural language processing remains a bottleneck hampering the scalability of Interactive Storytelling systems.

We introduce a novel interaction technique based solely on emotional speech recognition. It allows the user to take part in dialogue with virtual actors without any constraints on style or expressivity, by mapping the recognised emotional categories to narrative situations and virtual characters' feelings. Our Interactive Storytelling system uses an emotional planner to drive characters' behaviour. The main feature of this approach is that characters' feelings are part of the planning domain and are at the heart of the narrative representations. The emotional speech recogniser analyses the speech signal to produce a variety of features, which can be used to define ad hoc categories on which to train the system.
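The mapping from recognised emotional categories to character feelings could be sketched as follows. The category names and the feeling deltas are invented for illustration; the actual CALLAS planning domain is richer than a flat dictionary.

```python
# Hypothetical effect table: each recognised emotional category maps to
# deltas applied to the target character's feelings, which are part of
# the planning state the emotional planner operates on.
EMOTION_EFFECTS = {
    "affectionate": {"confidence": +1, "despair": -1},
    "dismissive":   {"confidence": -1, "despair": +1},
    "neutral":      {},
}

def apply_utterance(feelings, category):
    """Update the target character's feelings according to the emotion
    recognised in the user's speech; unknown categories leave the
    feelings unchanged."""
    updated = dict(feelings)
    for feeling, delta in EMOTION_EFFECTS.get(category, {}).items():
        updated[feeling] = updated.get(feeling, 0) + delta
    return updated
```

Because the feelings live in the planning state, each recognised utterance can change which character behaviours the planner subsequently selects.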
Emo Emma
The content of our interactive narrative is an adaptation of one chapter of the nineteenth-century classic novel "Madame Bovary", which is well suited to formalisation in terms of characters' feelings.
At various stages of the narrative, the user can address the main character or respond to her, impersonating her lover.
The emotional category extracted from the user's utterance is analysed in terms of the current narrative context, which includes the characters' beliefs, feelings and expectations. This produces a specific influence on the target character, which becomes visible through a change in its behaviour, achieving a high level of realism for the interaction.
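The final step, turning the character's updated feelings into a visible behaviour change, might look like the sketch below. The feeling names and behaviours are illustrative stand-ins for the emotional planner's actual action selection.

```python
def select_behaviour(feelings):
    """Pick the character's next visible behaviour from her current
    feelings -- a toy stand-in for the emotional planner's choice of
    action given the narrative context."""
    if feelings.get("despair", 0) > 0:
        return "turn_away"      # e.g. Emma withdraws from her lover
    if feelings.get("confidence", 0) > 0:
        return "approach"       # she responds warmly
    return "wait"               # no strong feeling dominates
```

The point of the design is that the user's tone of voice, not the literal words, decides which branch the character takes.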
Last Updated on Wednesday, 04 August 2010 21:25