This was our final week of rehearsal. Our last chance to get things in order before we show our project to the ETC faculty (and to the world)!
Truth be told, it was kind of a slog. While everything ultimately came together relatively close to how we wanted it, there were moments of Sisyphean frustration, where it felt like we were pushing the technology well beyond what it was capable of. This manifested in some of the following ways:
- The HoloLenses were draining power more quickly than they could be charged. Our small team of 6 people had a lot to manage this week, but keeping these devices powered turned into an act of spinning plates. Even when plugged directly into an electrical outlet, our networked experience was so resource-intensive that by the end of the evening, some of the HoloLenses would be nearly or entirely dead. A bit scary heading into performance week!
- We’d have long periods during rehearsals where the tech wasn’t cooperating and we had to hold the actors. Will and Amara (Peter and Wendy, respectively) developed a special bond as they waited…and waited…and waited some more while we ironed out problems before we could start running the scene again.
- Our livestreaming/recording solution was not looking great. With our dream of Spectator View (the solution proposed by Microsoft itself for high-quality capture) evaporating because of compatibility issues with our project, we were planning to record and stream directly from the HoloLens’ 720p camera. The result was a pixelated mess that (while low in latency) wouldn’t do our animations justice. At one point, Will, trying to send his friends a photo of what he was working on, held his phone up to the HoloLens’ lens, and the quality of the capture was good enough to get us thinking. We tried Google Chromecast with an Android phone, and while the hologram quality was much, MUCH better, the lag between what was happening in the HoloLens and what appeared on the monitors was too long for acceptable livestreaming. It turned out, however, that an Apple iPhone captured similarly high-quality versions of the holograms and, when mirrored to an Apple TV connected to the monitors, also produced a low-latency livestream. And thus our streaming and recording solution was born: an iPhone duct-taped inside a HoloLens on top of a(n unused) DSLR camera on top of a tripod. We’ll be explaining all of this more in our final presentation.
- The HoloLens clicker, our means of advancing Tink’s cues, was proving frustratingly unreliable. At random times during our run-throughs, a click would do absolutely nothing, and Tink would remain in her previous position/idle animation, leaving our actors hanging and unsure of what to do next.
Will we fix the dreaded clicker catastrophe? Will all of our HoloLenses die right when we place them on our actual audience members’ heads? Stay tuned for next week, as the curtain finally rises (and falls) on “Project Neverland!”