Little Magic Stories
Bringing the imagination of children to life through storytelling, performance and technology.
This installation aims to encourage children to use their creativity to bring stories to life. It helps to improve their confidence in self-expression and develops literacy and speaking skills. The installation allows them to create a performance from within their imagination, on stage, in front of an audience of family and friends.
The technology brings their drawings to life on stage, allowing them to interact and respond to their creations in real time. Using a holographic projection film, sets, characters and objects appear to float on stage alongside the performers. A camera and custom software track the performers, allowing the scene to react in a playful and dynamic way.
For children who show little interest in writing stories and drawing, but love to play video games, this project hopes to inspire them to participate, creating immersive worlds through which they can proudly express their own creations.
This is the first version of the project to test the idea and build the system. This story about the seasons was created entirely by the children, with the interactivity in the scenes built by me. Some scenes used motion detection in zones to trigger animations, such as catching Easter eggs, squashing sand castles or launching fireworks. Body tracking and basic physics were used in other scenes.
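The zone-triggered animations can be sketched with simple frame differencing: count how many pixels inside a rectangular zone changed between two camera frames, and fire the animation when enough movement is detected. This is a minimal sketch in plain C++ (the actual build used openFrameworks and OpenCV; the zone coordinates and thresholds here are hypothetical):

```cpp
#include <cassert>
#include <cstdint>
#include <cstdlib>
#include <vector>

// A rectangular trigger zone in camera coordinates.
struct Zone {
    int x, y, w, h;
};

// Count pixels inside the zone whose brightness changed by more than
// `threshold` between two greyscale frames (width * height, row-major).
// When enough pixels have changed, the zone "fires" and the scene
// would start its animation (catch the egg, squash the sandcastle...).
bool zoneTriggered(const std::vector<uint8_t>& prev,
                   const std::vector<uint8_t>& curr,
                   int width, const Zone& z,
                   int threshold, int minChangedPixels) {
    int changed = 0;
    for (int row = z.y; row < z.y + z.h; ++row) {
        for (int col = z.x; col < z.x + z.w; ++col) {
            int idx = row * width + col;
            if (std::abs(int(curr[idx]) - int(prev[idx])) > threshold) {
                ++changed;
            }
        }
    }
    return changed >= minChangedPixels;
}
```

In the real installation each scene defines its own set of zones, and a performer sweeping a hand through a zone is enough brightness change to trip the trigger.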
I am planning to use this project in workshops with groups of children to get them excited about storytelling. They will be able to use the system to create their own narratives, as well as drawing the content by hand, before performing to their friends.
The system will also gain improved physics, dynamic animation of objects, and animated sound for each scene.
I used the Musion Eyeliner holographic projection system for this project, allowing the graphics to appear to be alongside the performers. This uses a technique called Pepper’s ghost, and you can see the technical set-up here.
An Xbox Kinect camera was used to track the performers on stage. The Kinect was preferred over a normal camera for two reasons. First, as a depth camera it can tell when the performers are near the front of the stage, and therefore level with the graphics in terms of projection plane. Second, because its depth sensing works in the IR spectrum, it ignores the projected image and the stage lighting, both of which change between scenes.
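Since the Kinect reports a depth value per pixel in millimetres, detecting a performer at the front of the stage reduces to counting how many pixels fall inside a depth band just behind the projection plane. A minimal sketch of that idea (the band limits and pixel count are hypothetical; the real values depend on the rig):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Return true when enough depth pixels fall between nearMM and farMM,
// i.e. a performer is standing in the band at the front of the stage,
// level with the holographic projection plane.
bool performerAtFront(const std::vector<uint16_t>& depthMM,
                      uint16_t nearMM, uint16_t farMM,
                      std::size_t minPixels) {
    std::size_t inBand = 0;
    for (uint16_t d : depthMM) {
        if (d >= nearMM && d <= farMM) ++inBand;
    }
    return inBand >= minPixels;
}
```

Because this works purely on depth, the projected graphics behind the performer never register as motion.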
The software was custom-written in C++ using openFrameworks, OpenCV and Box2D.
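The "basic physics" in the body-tracking scenes is the kind of work Box2D handles: integrating velocities and positions each frame so drawn objects fall and drift believably. Stripped of the library, the core step looks like this (a toy semi-implicit Euler integrator, not the project's actual Box2D code):

```cpp
#include <cassert>

// A minimal rigid body: position and velocity in stage coordinates.
struct Body {
    float x, y, vx, vy;
};

// One semi-implicit Euler step: update velocity from gravity first,
// then position from the new velocity. Box2D's b2World::Step does
// this (plus collisions and constraints) for every body in the scene.
void step(Body& b, float gravity, float dt) {
    b.vy += gravity * dt;
    b.x  += b.vx * dt;
    b.y  += b.vy * dt;
}
```

Updating velocity before position keeps the integration stable at the frame rates a live performance needs.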
On Flickr: Making Of & Performance Shots.
Thanks to Musion for the screen time, and of course Jack, Louie & their Dad.