This week was the culmination of a lot of research and extensive prototyping, the results of which have allowed us to definitively move forward on a story/script. Team TheatAR is therefore proud to announce that we will be staging a scene from J.M. Barrie’s classic Peter Pan, featuring an AR-animated Tinker Bell! The team is referring to the entire process of successfully staging this scene, with its myriad technical challenges, as Project Neverland, and here is our glorious promotional artwork*:
So why Tinker Bell? After spending time testing augmented objects interacting with a physical space, the team observed that a flying character is well-suited to this kind of project: a character that floats and hovers is significantly more believable than one that walks on the ground, where every footstep makes any registration error between the virtual character and the physical floor obvious. Additionally, Tinker Bell is relatively small and fits comfortably even in the narrowest field of view (a limitation that plagues most of the available hardware). With those parameters in mind, Tinker Bell emerged as a leading candidate, and staging a scene from Peter Pan allows us to show the magic of an animated character having real presence on the stage, interacting with the physical set as well as the human actors in the scene.
In virtually all previous productions of Peter Pan, the character of Tinker Bell has been represented by a darting light effect that moves around the stage. While this was serviceable in 1904, when Barrie’s play first premiered, our approach aims to let this character generate real emotion: we’ll be able to see her facial expressions and body language, much as one would with an animated character in film. Tinker Bell can thus become an actual, meaningful character rather than an amorphous lighting trick.
We’ll be presenting our findings and progress thus far to the ETC faculty next Wednesday, but the results of our hardware tests point in favor of using one of the AR headsets rather than mobile AR, for two key reasons:
1) While mobile AR is capable of producing high-fidelity images on screen, AR objects and characters cannot be counted on to remain exactly where they’re placed; untenable drift occurs as the device moves around. Because we require our character to hit precise marks on our stage, we can tolerate only a very small amount of positional drift (see the measurement sketch after this list).
2) Holding up a phone or a tablet to watch a live theater experience is unsatisfying for an audience member. Mobile AR certainly has its applications in live performance (perhaps ideally when an audience member can move around a space and use the device as a “magic eye” to reveal things that can’t be seen by the naked eye), but for a seated performance where the action occurs at a fixed distance from the audience, holding a device in one’s hands creates an aesthetic distance between audience and performers that feels akin to watching a movie, even though it’s technically live.
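For the curious, here is a minimal sketch of one way to quantify that drift, assuming an ARKit-based test app on iOS. The `DriftProbe` class name and the 2 cm tolerance are our own illustrative choices, not part of ARKit itself:

```swift
import ARKit
import simd

// Minimal drift probe (illustrative sketch): remember where each anchor
// was first placed, then log how far ARKit has since moved it. A virtual
// character attached to a drifting anchor drifts by the same amount.
final class DriftProbe: NSObject, ARSessionDelegate {
    private var initialPositions: [UUID: SIMD3<Float>] = [:]
    // The 2 cm tolerance is an assumption; tune it to your staging needs.
    private let toleranceMeters: Float = 0.02

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            // Column 3 of the 4x4 transform is the anchor's world position.
            initialPositions[anchor.identifier] =
                simd_make_float3(anchor.transform.columns.3)
        }
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let start = initialPositions[anchor.identifier] else { continue }
            let drift = simd_distance(start,
                                      simd_make_float3(anchor.transform.columns.3))
            if drift > toleranceMeters {
                print("Anchor \(anchor.identifier) has drifted \(drift * 100) cm")
            }
        }
    }
}
```

Set an ARSession’s delegate to an instance of this class, walk the device around the room, and watch the log; drift of the kind this records is exactly what makes mobile AR untenable for hitting precise blocking marks on a stage.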
The existing AR headsets have their drawbacks (the limited field of view is certainly the most damning), but for our particular needs, and as of this writing, they appear to be the superior option.