Week 4 is our final week of preparation before Quarters. Quarters is when the faculty get a chance to see the intent of our project, the evidence of our research, and the level of coordination we can bring to achieving our goal. Effectively, it’s like a pitch. Therefore, this entire week has been spent ensuring that we have the following things ready to show them:
- A full concept bible with story, storyboard, and supporting art
- A 360° animatic that we can use to walk through the scene
- A demonstration of our ability to take motion capture data through the animation pipeline
- A spatialized sound demonstration
Concept Bible
After reviewing our boards and playtesting our story, we discovered that aspects of the emotional journey were still falling flat. In particular, our hero, Manny, wasn’t having much of an emotional journey at all (more of a gentle, flat stroll). To solve this problem, we decided to stop talking and start doing: We took a 360° camera, stood around it, and took turns pretending to be Manny as he journeyed from culture to culture. Our reasoning was: If we couldn’t recognize our own feelings in the experience, how could we expect our audience to relate? Now we have a more fleshed-out emotional map we can run by the faculty.
We also finished designing our environments, taking into account where every object in the room would be situated. Our concept artist researched each culture and determined which objects in the room would “transform” into new things when a particular culture starts dancing. Going into this week, we created a color script that our artist will use to produce sketches our modelers can match when lighting and texturing.
360° Animatic
Showing the 360° animatic demonstrates a few things: 1) that we have taken a design from modeling through to rendering; 2) that we are thinking about placement in the full-immersion space, even if we cannot draw in it; and 3) that we are prioritizing playtesting and iteration over having something refined early.
Thanks to our amazing modelers, we currently have a fully rendered version of the animatic with low-poly versions of the characters and the environment. Of course, going through this process helped us realize something we had been warned about very early: plan around the render farm.
First, we struggled with a new plug-in for 360° capture. The farm’s software was initially incompatible with it; after some serious tech support and some finagling with the plug-in’s developers, we got the farm up and running… two weeks later than we had initially planned.
Then, upon running tests on our 46-node farm, we realized that at 60 frames per second (the minimum necessary for comfort in VR), the render farm would need two and a half days to complete a barely colored, barely lit four minutes of film.
As a result, we’re now going to run tests to see what happens if we put a film running at 24 frames per second in a headset. If no one vomits while watching, we might have to switch, simply to save time. We are also considering how we can break the single scene into shorter chunks to “lock in” certain sections and avoid full-film renders.
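To make the time pressure concrete, here is a back-of-the-envelope sketch of the math. The node count, film length, frame rates, and the 2.5-day figure come from our tests; the per-frame cost and the 24 fps and per-chunk projections are derived estimates, not measurements:

```python
# Rough render-farm math. Observed inputs: 46 nodes, a 4-minute film,
# and ~2.5 days to render it at 60 fps. Everything else is derived.

NODES = 46
FILM_MINUTES = 4

def frames(fps: int, minutes: float = FILM_MINUTES) -> int:
    """Total frames for a clip of the given length and frame rate."""
    return int(minutes * 60 * fps)

# Derive the per-frame cost from the observed 60 fps run.
observed_node_hours = 2.5 * 24 * NODES                     # ~2,760 node-hours
node_min_per_frame = observed_node_hours * 60 / frames(60)
print(f"~{node_min_per_frame:.1f} node-minutes per frame")  # ~11.5

def wall_clock_hours(fps: int, minutes: float = FILM_MINUTES) -> float:
    """Projected farm time, assuming the per-frame cost stays constant."""
    return frames(fps, minutes) * node_min_per_frame / 60 / NODES

print(f"Full film at 24 fps: ~{wall_clock_hours(24):.0f} h")            # ~24 h
print(f"One 30 s chunk at 24 fps: ~{wall_clock_hours(24, 0.5):.0f} h")  # ~3 h
```

If the math holds, dropping to 24 fps turns a two-and-a-half-day queue into roughly a day, and re-rendering a single locked chunk into an afternoon, which is exactly why the comfort test matters.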
Motion Capture Tests
Now that we have all of our test data cleaned up, our tech artist was able to tie the motions into Manny’s low-poly model. While this process was relatively straightforward, seeing the model in motion gave us plenty of insight into what we will need from our dancers. For instance, upon seeing Manny walk, we could instantly tell the test dancer was female by the hip swing. Being conscious of these small things when directing in the motion capture studio is going to make all the difference in having a believable character on screen.
Informed by our discussions of the story and animatic, our tech artist is preparing a complementary, complex rig with which to animate the characters by hand. We are all considering when motion capture versus hand animation will be appropriate for the transitions between dances, and we will be testing this going into the coming week.
Spatialized Sound
Our sound artist worked extremely hard to figure out the sound technology. As a result, he flagged a decision we will have to make down the line: whether our target viewing platform is an Android phone or the Oculus, since each has a different system for handling the nuances of sound in 360°.
We have also been fortunate to “lock in” two of our three cultural dance and music resources. More specifically, dancers from the University of Pittsburgh’s Nrityamala Indian Dance Troupe have been consulting with us. They will be choreographing a piece for us, and they have also invited us to sit in on and record their rehearsals as they prepare for an upcoming competition. This has given us a unique opportunity to sample the 360° microphone we have at present (mainly to see if we can use it as a “stand-in,” should the nicer equipment fail) and to begin integrating sound more indicative of our intended style into the prototypes.
Going Into Quarters
Next week, after we receive our critiques from the faculty, we will begin to answer the following questions:
- What is the lowest frame rate we can get away with in 360°?
- How will we compartmentalize the film, and what naming convention will we use for frame and cut organization?
- How much animation do we want to do via motion capture, and how much by hand?
- What techniques of motion, light, and sound can we use to both direct the eye AND convey narrative in a space without camera frames or dialogue?